Index
A
- abort() - Method in interface org.apache.beam.runners.fnexecution.state.StateDelegator.Registration
-
De-registers the handler for all future requests for state for the registered process bundle instruction id.
- abort(Executor) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- absolute(String, String...) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
-
Construct a path from an absolute component path hierarchy.
- AbstractBeamCalcRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace Project and Filter node.
- AbstractBeamCalcRel(RelOptCluster, RelTraitSet, RelNode, RexProgram) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
- AbstractFlinkCombineRunner<K, InputT, AccumT, OutputT, W> - Class in org.apache.beam.runners.flink.translation.functions
-
Abstract base for runners that execute a Combine.PerKey.
- AbstractFlinkCombineRunner() - Constructor for class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner
- AbstractFlinkCombineRunner.CompleteFlinkCombiner<K, InputT, AccumT, OutputT> - Class in org.apache.beam.runners.flink.translation.functions
- AbstractFlinkCombineRunner.FinalFlinkCombiner<K, AccumT, OutputT> - Class in org.apache.beam.runners.flink.translation.functions
-
A final combiner that takes in AccumT and produces OutputT.
- AbstractFlinkCombineRunner.FlinkCombiner<K, InputT, AccumT, OutputT> - Interface in org.apache.beam.runners.flink.translation.functions
-
Adapter interface that allows using a CombineFnBase.GlobalCombineFn to either produce the AccumT as output or to combine several accumulators into an OutputT.
- AbstractFlinkCombineRunner.PartialFlinkCombiner<K, InputT, AccumT> - Class in org.apache.beam.runners.flink.translation.functions
-
A partial combiner that takes in InputT and produces AccumT.
- AbstractGetterTypeSupplier() - Constructor for class org.apache.beam.sdk.schemas.AutoValueSchema.AbstractGetterTypeSupplier
- AbstractInOutIterator<K, InputT, OutputT> - Class in org.apache.beam.runners.spark.translation
-
Abstract base class for iterators that process Spark input data and produce corresponding output values.
- AbstractInOutIterator(SparkProcessContext<K, InputT, OutputT>) - Constructor for class org.apache.beam.runners.spark.translation.AbstractInOutIterator
- AbstractReadFileRangesFn(SerializableFunction<String, ? extends FileBasedSource<InT>>, ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform.AbstractReadFileRangesFn
- AbstractResult() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifyComponentContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.WildcardContext
- accept(BeamFnApi.Elements) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
- accept(SchemaZipFold.Context, Optional<Schema.Field>, Optional<Schema.Field>) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
-
Accepts two fields, context.parent() is always ROW.
- accept(SchemaZipFold.Context, Schema.FieldType, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
-
Accepts two components, context.parent() is always ROW, MAP, ARRAY or absent.
- accept(ByteString) - Method in class org.apache.beam.sdk.fn.data.DecodingFnDataReceiver
- accept(T) - Method in interface org.apache.beam.sdk.fn.data.FnDataReceiver
- accept(T) - Method in interface org.apache.beam.sdk.function.ThrowingConsumer
- accept(T1, T2) - Method in interface org.apache.beam.sdk.function.ThrowingBiConsumer
- accessPattern() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
- accessType() - Method in class org.apache.beam.sdk.io.solace.data.Semp.QueueData
- accumulate(T, T) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
-
Accumulate two results together.
- accumulateWeight(long) - Method in class org.apache.beam.sdk.fn.data.WeightedList
- ACCUMULATING_FIRED_PANES - Enum constant in enum class org.apache.beam.sdk.values.WindowingStrategy.AccumulationMode
- AccumulatingCombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
- accumulatingFiredPanes() - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Returns a new Window PTransform that uses the registered WindowFn and triggering behavior, and that accumulates elements in a pane after they are triggered.
- ACCUMULATOR_NAME - Static variable in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
- AccumulatorCheckpointingSparkListener() - Constructor for class org.apache.beam.runners.spark.metrics.MetricsAccumulator.AccumulatorCheckpointingSparkListener
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return the ack deadline, in seconds, for subscription.
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- ackId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
-
Id to pass back to Pubsub to acknowledge receipt of this message.
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Acknowledge messages from subscription with ackIds.
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- acquireTaskAttemptIdLock(Configuration, int) - Method in interface org.apache.beam.sdk.io.hadoop.format.ExternalSynchronization
-
Creates a unique TaskAttemptID for the given taskId.
- acquireTaskAttemptIdLock(Configuration, int) - Method in class org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization
- acquireTaskIdLock(Configuration) - Method in interface org.apache.beam.sdk.io.hadoop.format.ExternalSynchronization
-
Creates a TaskID with an id that is unique within the given job.
- acquireTaskIdLock(Configuration) - Method in class org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization
- action() - Method in class org.apache.beam.runners.spark.translation.BoundedDataset
- action() - Method in interface org.apache.beam.runners.spark.translation.Dataset
- action() - Method in class org.apache.beam.runners.spark.translation.streaming.UnboundedDataset
- ActionFactory - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
-
Factory class for creating instances that will handle different functions of DoFns.
- ActionFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
Factory class for creating instances that will handle each type of record within a change stream query.
- ActionFactory() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
- ActionFactory() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
- activate(MetricsContainer) - Method in class org.apache.beam.sdk.metrics.MetricsEnvironment.MetricsContainerHolder
- activate(MetricsContainer) - Method in interface org.apache.beam.sdk.metrics.MetricsEnvironment.MetricsEnvironmentState
- ACTIVE_PARTITION_READ_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the active partition reads during the execution of the Connector.
- actuateProjectionPushdown(Map<TupleTag<?>, FieldAccessDescriptor>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- actuateProjectionPushdown(Map<TupleTag<?>, FieldAccessDescriptor>) - Method in interface org.apache.beam.sdk.schemas.ProjectionProducer
-
Actuate a projection pushdown.
- add(int, GlobalWatermarkHolder.SparkWatermarks) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
- add(long) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
-
Deprecated. Adds a value to the heap, returning whether the value is (large enough to be) in the heap.
- add(long, Instant, boolean) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
- add(InputT) - Method in interface org.apache.beam.sdk.state.GroupingState
-
Add a value to the buffer.
- add(Class<?>, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
- add(Iterable<String>) - Method in class org.apache.beam.runners.jet.metrics.BoundedTrieImpl
- add(Iterable<String>) - Method in interface org.apache.beam.sdk.metrics.BoundedTrie
-
Adds a path to the trie.
- add(Iterable<String>) - Method in class org.apache.beam.sdk.metrics.Lineage
-
Adds the given fqn as lineage.
- add(Iterable<TimestampedValue<T>>) - Static method in class org.apache.beam.sdk.testing.TestStream.ElementEvent
-
For internal use only: no backwards compatibility guarantees.
- add(Type, String, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
- add(String) - Method in class org.apache.beam.runners.jet.metrics.StringSetImpl
- add(String) - Method in interface org.apache.beam.sdk.metrics.StringSet
-
Add a value to this set.
- add(String...) - Method in class org.apache.beam.runners.jet.metrics.BoundedTrieImpl
- add(String...) - Method in class org.apache.beam.runners.jet.metrics.StringSetImpl
- add(String...) - Method in interface org.apache.beam.sdk.metrics.BoundedTrie
-
Adds a path to the trie.
- add(String...) - Method in interface org.apache.beam.sdk.metrics.StringSet
-
Add values to this set.
- add(String, String, Iterable<String>, String) - Method in class org.apache.beam.sdk.metrics.Lineage
-
Add a FQN (fully-qualified name) to Lineage.
- add(String, Iterable<String>) - Method in class org.apache.beam.sdk.metrics.Lineage
-
Add a FQN (fully-qualified name) to Lineage.
- add(String, Iterable<String>, String) - Method in class org.apache.beam.sdk.metrics.Lineage
-
Add a FQN (fully-qualified name) to Lineage.
- add(List<ValueInSingleWindow<T>>, TableDataInsertAllResponse.InsertErrors, TableReference, FailsafeValueInSingleWindow<TableRow, TableRow>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
- add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
- add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
- add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
- add(DisplayData.ItemSpec<?>) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register the given display item.
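The DisplayData.Builder.add(...) hook above is typically invoked from a transform's populateDisplayData override. The sketch below is illustrative: the FilterAboveThreshold class and the "threshold" key are hypothetical names, not part of the Beam API.

```java
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.display.DisplayData;

// Hypothetical DoFn showing how a transform registers display items.
public class FilterAboveThreshold extends DoFn<Long, Long> {
  private final long threshold;

  public FilterAboveThreshold(long threshold) {
    this.threshold = threshold;
  }

  @ProcessElement
  public void processElement(@Element Long element, OutputReceiver<Long> out) {
    if (element > threshold) {
      out.output(element);
    }
  }

  @Override
  public void populateDisplayData(DisplayData.Builder builder) {
    super.populateDisplayData(builder);
    // Registered items surface in runner monitoring UIs.
    builder.add(DisplayData.item("threshold", threshold));
  }

  public static void main(String[] args) {
    System.out.println(DisplayData.from(new FilterAboveThreshold(5)));
  }
}
```

DisplayData.from(...) collects the items registered by populateDisplayData, which is how runners and tests inspect a transform's display data.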
- add(KV<byte[], byte[]>) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
- add(KV<byte[], byte[]>) - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter
-
Adds a given record to the sorter.
- add(WindowedValue<InputT>, SparkCombineFn<InputT, ValueT, AccumT, ?>) - Method in interface org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator
-
Add value with unexploded windows into the accumulator.
- add(T) - Method in class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
- add(T, long) - Method in class org.apache.beam.sdk.fn.data.WeightedList
- add(T, long, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
- add(T, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
- addAccum(AccumT) - Method in interface org.apache.beam.sdk.state.CombiningState
-
Add an accumulator to this state cell.
- addAll(List<T>, long) - Method in class org.apache.beam.sdk.fn.data.WeightedList
- addAll(Map<Integer, Queue<GlobalWatermarkHolder.SparkWatermarks>>) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
- addAll(WeightedList<T>) - Method in class org.apache.beam.sdk.fn.data.WeightedList
- addAnnotation(String, byte[]) - Method in class org.apache.beam.sdk.transforms.PTransform
- addArray(Object...) - Method in class org.apache.beam.sdk.values.Row.Builder
- addArray(Collection<T>) - Method in class org.apache.beam.sdk.values.Row.Builder
- addArrayField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addAttempted(T, BiFunction<T, T, T>) - Method in class org.apache.beam.sdk.metrics.MetricResult
- addBatchWriteRequest(long, boolean) - Method in interface org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler.Stats
- addBoolean(Map<String, Object>, String, boolean) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addBooleanField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addByteArrayField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addByteField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addCoderAndEncodedRecord(Coder<T>, T) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
- addCollectionToSingletonOutput(PCollection<?>, String, PCollectionView<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an output to this CollectionToSingleton Dataflow step, consuming the specified input PValue and producing the specified output PValue.
- addCommitted(T, BiFunction<T, T, T>) - Method in class org.apache.beam.sdk.metrics.MetricResult
- addDataSet(String, DataSet<T>) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
- addDataStream(String, DataStream<T>) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
- addDateTimeField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addDecimalField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addDictionary(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addDouble(Map<String, Object>, String, Double) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addDoubleField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addElements(TimestampedValue<T>, TimestampedValue<T>...) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Adds the specified elements to the source with the provided timestamps.
- addElements(T, T...) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Adds the specified elements to the source with timestamp equal to the current watermark.
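The two addElements overloads above are used together when building a TestStream. A minimal sketch, assuming the standard TestStream builder contract (every stream must end with advanceWatermarkToInfinity):

```java
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.testing.TestStream;
import org.apache.beam.sdk.values.TimestampedValue;
import org.joda.time.Instant;

public class TestStreamExample {
  public static void main(String[] args) {
    TestStream<String> stream =
        TestStream.create(StringUtf8Coder.of())
            // These elements are stamped at the current (initial) watermark.
            .addElements("a", "b")
            // Advance the watermark, then add an explicitly timestamped element.
            .advanceWatermarkTo(new Instant(100L))
            .addElements(TimestampedValue.of("c", new Instant(100L)))
            // Every TestStream must finish by advancing the watermark to infinity.
            .advanceWatermarkToInfinity();
    System.out.println(stream.getEvents().size());
  }
}
```

Applying the resulting stream to a test pipeline replays the element and watermark events in order, which makes trigger behavior deterministic in tests.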
- addEncodingInput(Coder<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Sets the encoding for this Dataflow step.
- addErrorCollection(PCollection<ErrorT>) - Method in interface org.apache.beam.sdk.transforms.errorhandling.ErrorHandler
- addErrorCollection(PCollection<ErrorT>) - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.DefaultErrorHandler
- addErrorCollection(PCollection<ErrorT>) - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler
- addErrorForCode(int, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.Builder
-
Adds a matcher to log the provided string if the error matches a particular status code.
- addErrorForCodeAndUrlContains(int, String, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.Builder
-
Adds a matcher to log the provided string if the error matches a particular status code and the url contains a certain string.
- addExceptionStackTrace(Exception) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
- addExperiment(ExperimentalOptions, String) - Static method in interface org.apache.beam.sdk.options.ExperimentalOptions
-
Adds experiment to options if not already present.
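Because addExperiment only adds the flag if absent, repeated calls are safe. A small sketch (the "use_runner_v2" flag is just an example value):

```java
import org.apache.beam.sdk.options.ExperimentalOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class ExperimentExample {
  public static void main(String[] args) {
    ExperimentalOptions options =
        PipelineOptionsFactory.create().as(ExperimentalOptions.class);
    // Idempotent: the second call is a no-op because the flag is already present.
    ExperimentalOptions.addExperiment(options, "use_runner_v2");
    ExperimentalOptions.addExperiment(options, "use_runner_v2");
    System.out.println(options.getExperiments());
  }
}
```

hasExperiment(options, "use_runner_v2") can then be used to check membership before branching on experimental behavior.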
- addFailure(String, String) - Method in class org.apache.beam.sdk.io.cdap.context.FailureCollectorWrapper
- addField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addField(Schema.Field) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addFields(List<Schema.Field>) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addFields(Schema.Field...) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- AddFields - Class in org.apache.beam.sdk.schemas.transforms
-
A transform to add new nullable fields to a PCollection's schema.
- AddFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.AddFields
- AddFields.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
-
Inner PTransform for AddFields.
- addFloatField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- AddHarnessIdInterceptor - Class in org.apache.beam.sdk.fn.channel
-
A ClientInterceptor that attaches a provided SDK Harness ID to outgoing messages.
- addHumanReadableJson(Object) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
- addIfAbsent(T) - Method in interface org.apache.beam.sdk.state.SetState
-
Ensures a value is a member of the set, returning true if it was added and false otherwise.
- addIfNotDefault(DisplayData.ItemSpec<T>, T) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register the given display item if the value is different than the specified default.
- addIfNotNull(DisplayData.ItemSpec<?>) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register the given display item if the value is not null.
- addIncompleteNewPartitions(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
-
Add NewPartition if it hasn't been updated for 15 minutes.
- addIncompleteNewPartitions(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
-
Capture NewPartition row that cannot merge on its own.
- addInput(double[], Double) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- addInput(int[], Integer) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- addInput(long[], Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- addInput(long[], Long) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- addInput(AccumT, InputT) - Method in interface org.apache.beam.sdk.extensions.sql.udf.AggregateFn
-
Adds the given input value to the given accumulator, returning the new accumulator value.
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Adds the given input value to the given accumulator, returning the new accumulator value.
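The addInput contract above (take an accumulator and an input, return the updated accumulator) is easiest to see in a small custom Combine.CombineFn. MeanFn is a hypothetical class written for illustration; the accumulator is a {sum, count} pair threaded through addInput and mergeAccumulators.

```java
import org.apache.beam.sdk.transforms.Combine;

// A minimal sketch of a custom CombineFn computing a mean.
public class MeanFn extends Combine.CombineFn<Long, long[], Double> {
  @Override
  public long[] createAccumulator() {
    return new long[] {0L, 0L}; // {sum, count}
  }

  @Override
  public long[] addInput(long[] accum, Long input) {
    accum[0] += input;
    accum[1] += 1;
    return accum; // returning the (mutated) accumulator is permitted
  }

  @Override
  public long[] mergeAccumulators(Iterable<long[]> accums) {
    long[] merged = createAccumulator();
    for (long[] a : accums) {
      merged[0] += a[0];
      merged[1] += a[1];
    }
    return merged;
  }

  @Override
  public Double extractOutput(long[] accum) {
    return accum[1] == 0 ? 0.0 : ((double) accum[0]) / accum[1];
  }

  public static void main(String[] args) {
    MeanFn fn = new MeanFn();
    long[] acc = fn.createAccumulator();
    acc = fn.addInput(acc, 3L);
    acc = fn.addInput(acc, 5L);
    System.out.println(fn.extractOutput(acc)); // 4.0
  }
}
```

A runner may call addInput and mergeAccumulators in any grouping, so both must be associative over the same accumulator representation.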
- addInput(AccumT, InputT, Long, Long, Long) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
- addInput(AccumT, InputT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Adds the given input value to the given accumulator, returning the new accumulator value.
- addInput(HyperLogLogPlus, InputT) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
- addInput(MergingDigest, Double) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
- addInput(InputT) - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
-
Adds the given input value to this accumulator, modifying this accumulator.
- addInput(Iterable<T>, T) - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- addInput(Long, Long) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
- addInput(Object[], DataT) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- addInput(Object[], DataT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- addInput(String, byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
- addInput(String, Boolean) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name and value to this Dataflow step.
- addInput(String, Long) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name and value to this Dataflow step.
- addInput(String, String) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name and value to this Dataflow step.
- addInput(String, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
- addInput(String, List<? extends Map<String, Object>>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input that is a list of objects.
- addInput(String, Map<String, Object>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input that is a dictionary of strings to objects.
- addInput(String, PInput) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name to this Dataflow step, coming from the specified input PValue.
- addInput(List<String>, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- addInput(List<T>, T) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- addInput(List<V>, V) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- addInput(K, AccumT, AccumT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.FinalFlinkCombiner
- addInput(K, AccumT, InputT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.CompleteFlinkCombiner
- addInput(K, AccumT, InputT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in interface org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.FlinkCombiner
- addInput(K, AccumT, InputT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.PartialFlinkCombiner
- addInput(SequenceRangeAccumulator, TimestampedValue<KV<EventKeyT, KV<Long, EventT>>>) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- addInput(SketchFrequencies.Sketch<InputT>, InputT) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
- addInput(CovarianceAccumulator, Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- addInput(VarianceAccumulator, T) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- addInput(BeamBuiltinAggregations.BitXOr.Accum, T) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- addInput(ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique, T) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- addInput(Combine.Holder<V>, V) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
- addInput(Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>, T) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
- addInt16Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addInt32Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addInt64Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addIterable(Iterable<T>) - Method in class org.apache.beam.sdk.values.Row.Builder
- addIterableField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
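The addXxxField methods listed here are chained on Schema.Builder to declare a row schema field by field. A short sketch (field names "name", "age", "nickname" are illustrative):

```java
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.values.Row;

public class SchemaExample {
  public static void main(String[] args) {
    // Build a three-field schema with the fluent builder.
    Schema schema =
        Schema.builder()
            .addStringField("name")
            .addInt32Field("age")
            .addNullableStringField("nickname")
            .build();

    // Rows are constructed against the schema; values must match field order.
    Row row = Row.withSchema(schema).addValues("Ada", 36, null).build();
    System.out.println(row.getInt32("age")); // 36
  }
}
```

The addNullableXxxField variants further down this index declare the same field types but permit null values, as in the "nickname" field above.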
- Additional Outputs - Search tag in class org.apache.beam.sdk.transforms.ParDo
- Section
- additionalOutputTags - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- addKnownCoderUrn(String) - Static method in class org.apache.beam.runners.fnexecution.wire.LengthPrefixUnknownCoders
-
Registers a coder as being of known type and as such not meriting length prefixing.
- addLabel(String, String) - Method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.MetricNameBuilder
-
Add a metric label KV pair to the metric name.
- addLengthPrefixedCoder(String, RunnerApi.Components.Builder, boolean) - Static method in class org.apache.beam.runners.fnexecution.wire.LengthPrefixUnknownCoders
-
Recursively traverses the coder tree and wraps the first unknown coder in every branch with a LengthPrefixCoder unless an ancestor coder is itself a LengthPrefixCoder.
- addList(Map<String, Object>, String, List<? extends Map<String, Object>>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addList(Map<String, Object>, String, T[]) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addLogicalTypeConversions(GenericData) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
- addLogicalTypeField(String, Schema.LogicalType<InputT, BaseT>) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addLong(Map<String, Object>, String, long) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addLongs(Map<String, Object>, String, long...) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addMapField(String, Schema.FieldType, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addMessageListener(Consumer<JobApi.JobMessage>) - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Listen for job messages with a Consumer.
- addMethodParameters(Method) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
- addMetricLabel(String, String) - Method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.MetricNameBuilder
-
Add a metric label KV pair to the metric.
- addMissingPartitions(List<Range.ByteStringRange>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
-
Add all the missingPartitions.
- addMissingPartitions(List<Range.ByteStringRange>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
-
Capture partitions that are not currently being streamed.
- addNameFilter(MetricNameFilter) - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
-
Add a MetricNameFilter.
- addNull(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addNullableArrayField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableBooleanField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableByteArrayField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableByteField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableDateTimeField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableDecimalField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableDoubleField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableFloatField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableInt16Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableInt32Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableInt64Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableIterableField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableLogicalTypeField(String, Schema.LogicalType<InputT, BaseT>) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableMapField(String, Schema.FieldType, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableRowField(String, Schema) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableStringField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addObject(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addOptions(Schema.Options) - Method in class org.apache.beam.sdk.schemas.Schema.Options.Builder
- addOutput(Output) - Method in class org.apache.beam.sdk.io.cdap.context.BatchSinkContextImpl
-
Overrides the output configuration of this Batch job to the specified
Output. - addOutput(String, PCollection<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds a primitive output to this Dataflow step with the given name as the local output name, producing the specified output
PValue, including its Coder if a TypedPValue. - addOverrideForClass(Class<?>, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Deprecated. Overrides the default log level for the passed in class.
- addOverrideForClass(Class<?>, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
-
Overrides the default log level for the passed in class.
- addOverrideForName(String, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Deprecated. Overrides the default log level for the passed in name.
- addOverrideForName(String, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
-
Overrides the default log level for the passed in name.
- addOverrideForPackage(Package, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Deprecated. Overrides the default log level for the passed in package.
- addOverrideForPackage(Package, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
-
Overrides the default log level for the passed in package.
- addProperties(MetadataEntity, Map<String, String>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- addReader(int) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplitEnumerator
- addReader(int) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.LazyFlinkSourceSplitEnumerator
- addRowField(String, Schema) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addRows(Object...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
-
Add rows to the builder.
- addRows(String, Row...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- addRows(Duration, Object...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
-
Add rows to the builder.
- addRunnerWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components.Builder, RunnerApi.ExecutableStagePayload.WireCoderSetting) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
-
Creates a runner-side wire coder for a port read/write for the given PCollection.
- addSchema(String, TableProvider) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Add a top-level schema backed by the table provider.
- addSdkWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components.Builder, RunnerApi.ExecutableStagePayload.WireCoderSetting) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
-
Creates an SDK-side wire coder for a port read/write for the given PCollection.
- AddShardKeyDoFn - Class in org.apache.beam.sdk.io.solace.write
-
This class adds a pseudo-key with a given cardinality.
- AddShardKeyDoFn(int) - Constructor for class org.apache.beam.sdk.io.solace.write.AddShardKeyDoFn
- addSideInputValue(StreamRecord<RawUnionValue>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
-
Add the side input value.
- addSideInputValue(StreamRecord<RawUnionValue>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- addSplits(List<FlinkSourceSplit<T>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- addSplitsBack(List<FlinkSourceSplit<T>>, int) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplitEnumerator
- addSplitsBack(List<FlinkSourceSplit<T>>, int) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.LazyFlinkSourceSplitEnumerator
- addSplitsToUnfinishedForCheckpoint(long, List<FlinkSourceSplit<T>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
-
To be overridden in unbounded reader.
- addSplitsToUnfinishedForCheckpoint(long, List<FlinkSourceSplit<T>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
- addStateListener(Consumer<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Listen for job state changes with a
Consumer. - addStep(String) - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
-
Add a step filter.
- addStep(PTransform<?, ?>, String) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Adds a step to the Dataflow workflow for the given transform, with the given Dataflow step type.
- addString(Map<String, Object>, String, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addStringField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addStringList(Map<String, Object>, String, List<String>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addTags(MetadataEntity, Iterable<String>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- addTags(MetadataEntity, String...) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- addTimers(Iterator<TimerInternals.TimerData>) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- addToCurrentBundle(Solace.Record) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- addTraceFor(AbstractGoogleClientRequest<?>, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
-
Creates a
GoogleApiDebugOptions.GoogleApiTracer that sets the trace traceDestination on all calls that match for the given request type. - addTraceFor(AbstractGoogleClient, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
-
Creates a
GoogleApiDebugOptions.GoogleApiTracer that sets the trace destination on all calls that match the given client type. - addUdaf(String, Combine.CombineFn) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Register a UDAF function which can be used in GROUP-BY expression.
- addUdf(String, Class<?>, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Register a UDF function which can be used in SQL expression.
- addUdf(String, Class<? extends BeamSqlUdf>) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Register a UDF function which can be used in SQL expression.
- addUdf(String, SerializableFunction) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Register a UDF function which can be used in SQL expression.
- addUuids() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
-
Add Uuids to to-be-published messages, ensuring that uniqueness is maintained.
- AddUuidsTransform - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
A transform to add UUIDs to each message to be written to Pub/Sub Lite.
- AddUuidsTransform() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.AddUuidsTransform
- addValue(Object) - Method in class org.apache.beam.sdk.values.Row.Builder
- addValues(Object...) - Method in class org.apache.beam.sdk.values.Row.Builder
- addValues(List<Object>) - Method in class org.apache.beam.sdk.values.Row.Builder
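The Row.Builder addValue/addValues methods indexed above populate a row positionally against its schema. A minimal illustrative sketch (not part of the generated index; the schema and values are hypothetical):

```java
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.values.Row;

public class RowBuilderSketch {
  public static void main(String[] args) {
    Schema schema = Schema.builder()
        .addStringField("name")
        .addInt32Field("age")
        .build();
    // addValues appends values in schema field order.
    Row row = Row.withSchema(schema)
        .addValues("alice", 30)
        .build();
    System.out.println(row.getString("name")); // alice
  }
}
```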
- addWatermarkHoldUsage(Instant) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals
- advance() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
- advance() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
See
GlobalWatermarkHolder.advance(String). - advance() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
For subscription mode only: Track progression of time according to the
Clock passed.
- advance() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
- advance() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Advances the reader to the next valid record.
- advance() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Advances the reader to the next valid record.
- advanceBy(Duration) - Static method in class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
-
For internal use only: no backwards compatibility guarantees.
- Advanced features - Search tag in class org.apache.beam.sdk.io.TextIO
- Section
- Advanced Kafka Configuration - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- Advanced SolaceIO#read(TypeDescriptor, SerializableFunction, SerializableFunction) top-level method - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- advanceImpl() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
- advanceImpl() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
Advances to the next record and returns
true, or returns false if there is no next record. - advanceNextBatchWatermarkToInfinity() - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Advances the watermark in the next batch to the end-of-time.
- advanceProcessingTime(Duration) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Advance the processing time by the specified amount.
- advanceTo(Instant) - Static method in class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
-
For internal use only: no backwards compatibility guarantees.
- advanceWatermark() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Advances the watermark.
- advanceWatermarkForNextBatch(Instant) - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Advances the watermark in the next batch.
- advanceWatermarkTo(Instant) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Advance the watermark of this source to the specified instant.
- advanceWatermarkToInfinity() - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Advance the watermark to infinity, completing this
TestStream. - AdvancingPhaser - Class in org.apache.beam.sdk.fn.stream
-
A
Phaser which never terminates.
- AfterAll - Class in org.apache.beam.sdk.transforms.windowing
-
A composite
Trigger that fires when all of its sub-triggers are ready. - afterBundleCommit(Instant, DoFn.BundleFinalizer.Callback) - Method in interface org.apache.beam.sdk.transforms.DoFn.BundleFinalizer
-
The provided function will be called after the runner successfully commits the output of a successful bundle.
- afterEach(ExtensionContext) - Method in class org.apache.beam.sdk.testing.TestPipelineExtension
- AfterEach - Class in org.apache.beam.sdk.transforms.windowing
-
A composite
Trigger that executes its sub-triggers in order. - AfterFirst - Class in org.apache.beam.sdk.transforms.windowing
-
A composite
Trigger that fires once after at least one of its sub-triggers has fired. - afterIterations(int) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Returns a
Watch.Growth.TerminationCondition that holds after the given number of polling iterations have occurred per-input. - AfterPane - Class in org.apache.beam.sdk.transforms.windowing
-
A
Trigger that fires at some point after a specified number of input elements have arrived. - AfterProcessingTime - Class in org.apache.beam.sdk.transforms.windowing
-
A
Trigger that fires at a specified point in processing time, relative to when input first arrives. - AfterSynchronizedProcessingTime - Class in org.apache.beam.sdk.transforms.windowing
-
FOR INTERNAL USE ONLY.
- afterTimeSinceNewOutput(SerializableFunction<InputT, ReadableDuration>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Like
Watch.Growth.afterTimeSinceNewOutput(ReadableDuration), but the duration is input-dependent. - afterTimeSinceNewOutput(ReadableDuration) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Returns a
Watch.Growth.TerminationCondition that holds after the given time has elapsed after the last time the Watch.Growth.PollResult for the current input contained a previously unseen output. - afterTotalOf(SerializableFunction<InputT, ReadableDuration>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Like
Watch.Growth.afterTotalOf(ReadableDuration), but the duration is input-dependent. - afterTotalOf(ReadableDuration) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Returns a
Watch.Growth.TerminationCondition that holds after the given time has elapsed after the current input was seen. - AfterWatermark - Class in org.apache.beam.sdk.transforms.windowing
-
AfterWatermark triggers fire based on progress of the system watermark. - AfterWatermark.AfterWatermarkEarlyAndLate - Class in org.apache.beam.sdk.transforms.windowing
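The Watch.Growth termination conditions indexed above (afterIterations, afterTimeSinceNewOutput, afterTotalOf, allOf) are usually combined when configuring a Watch transform. An illustrative sketch (not part of the generated index; the durations are hypothetical):

```java
import org.apache.beam.sdk.transforms.Watch;
import org.joda.time.Duration;

public class TerminationSketch {
  public static Watch.Growth.TerminationCondition<String, ?> condition() {
    // Stop watching an input when BOTH conditions hold:
    // one hour of total watch time has elapsed, AND
    // no previously unseen output has appeared for ten minutes.
    return Watch.Growth.allOf(
        Watch.Growth.<String>afterTotalOf(Duration.standardHours(1)),
        Watch.Growth.<String>afterTimeSinceNewOutput(Duration.standardMinutes(10)));
  }
}
```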
- AfterWatermark.FromEndOfWindow - Class in org.apache.beam.sdk.transforms.windowing
-
A watermark trigger targeted relative to the end of the window.
- aggregate(Combine.CombineFn<InputT, ?, OutputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Aggregate the grouped data using the specified
Combine.CombineFn. - AggregateCombiner() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
-
Build up an aggregation function over the input elements.
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
-
Build up an aggregation function over the input elements by field id.
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
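The Group aggregateField/aggregateFields overloads indexed above attach a Combine.CombineFn to one or more fields of grouped schema rows. An illustrative sketch (not part of the generated index; the field names are hypothetical):

```java
import org.apache.beam.sdk.schemas.transforms.Group;
import org.apache.beam.sdk.transforms.Sum;
import org.apache.beam.sdk.values.Row;

public class GroupSketch {
  public static Group.CombineFieldsByFields<Row> build() {
    // Group rows by "userId", then sum the "purchaseAmount" field,
    // writing the result to an output field named "totalPurchases".
    return Group.<Row>byFieldNames("userId")
        .aggregateField("purchaseAmount", Sum.ofDoubles(), "totalPurchases");
  }
}
```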
- AggregateFn<InputT, AccumT, OutputT> - Interface in org.apache.beam.sdk.extensions.sql.udf
-
An aggregate function that can be executed as part of a SQL query.
- AggregationCombineFnAdapter<T> - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
-
Wrapper
Combine.CombineFns for aggregation function calls. - AggregationCombineFnAdapter() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
- Aggregation of records - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- AggregationQuery - Class in org.apache.beam.sdk.io.mongodb
-
Builds a MongoDB AggregateIterable object.
- AggregationQuery() - Constructor for class org.apache.beam.sdk.io.mongodb.AggregationQuery
- algorithm(String) - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey.Builder
- ALIAS - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.DefaultType
- align(Duration) - Method in interface org.apache.beam.sdk.state.Timer
-
Aligns the target timestamp used by
Timer.setRelative() to the next boundary of period. - alignedTo(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
Aligns the time to be the smallest multiple of
period greater than the epoch boundary (aka new Instant(0)). - alignedTo(Duration, Instant) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
Aligns timestamps to the smallest multiple of
period since the offset greater than the timestamp. - alignTo(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
-
For internal use only; no backwards-compatibility guarantees.
- alignTo(Duration, Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
-
For internal use only; no backwards-compatibility guarantees.
- AlignTo() - Constructor for class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
- ALL - Enum constant in enum class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.ListQualifier
- ALL - Enum constant in enum class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.MapQualifier
- ALL_CONTEXTS - Static variable in class org.apache.beam.sdk.testing.CoderProperties
-
All the contexts, for use in test cases.
- ALL_KEYS - Static variable in class org.apache.beam.sdk.io.range.ByteKeyRange
-
The range of all keys, with empty start and end keys.
- allLeavesDescriptor(Schema, SerializableFunction<List<String>, String>) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
- allMatches(String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.AllMatches PTransform that checks if the entire line matches the Regex. - allMatches(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.AllMatches PTransform that checks if the entire line matches the Regex. - AllMatches(Pattern) - Constructor for class org.apache.beam.sdk.transforms.Regex.AllMatches
- allMetrics() - Method in class org.apache.beam.sdk.metrics.MetricResults
- allocateAddressAndCreate(List<BindableService>, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.sdk.fn.server.InProcessServerFactory
- allocateAddressAndCreate(List<BindableService>, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Creates an instance of this server using an ephemeral address.
- allocateAddressAndCreate(List<BindableService>, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.sdk.fn.server.ServerFactory.InetSocketAddressServerFactory
- allocatePortAndCreateFor(List<? extends FnService>, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Create
GrpcFnServers for the provided FnServices running on an arbitrary port. - allocatePortAndCreateFor(ServiceT, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Create a
GrpcFnServer for the provided FnService running on an arbitrary port. - allOf(Iterable<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.allOf(Iterable). - allOf(SerializableMatcher<T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.allOf(Matcher[]). - allOf(Watch.Growth.TerminationCondition<InputT, FirstStateT>, Watch.Growth.TerminationCondition<InputT, SecondStateT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Returns a
Watch.Growth.TerminationCondition that holds when both of the given two conditions hold. - ALLOW - Enum constant in enum class org.apache.beam.sdk.io.fs.EmptyMatchTreatment
-
Filepatterns matching no resources are allowed.
- ALLOW_DUPLICATES - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- ALLOW_FIELD_ADDITION - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.SchemaUpdateOption
-
Allow adding a nullable field to the schema.
- ALLOW_FIELD_RELAXATION - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.SchemaUpdateOption
-
Allow relaxing a required field in the original schema to nullable.
- ALLOW_IF_WILDCARD - Enum constant in enum class org.apache.beam.sdk.io.fs.EmptyMatchTreatment
-
Filepatterns matching no resources are allowed if the filepattern contains a glob wildcard character, and disallowed otherwise (i.e.
- allowDuplicates() - Method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
-
Returns whether it allows duplicated elements in the output.
- ALLOWS_SHARDABLE_STATE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
Whether this reader should allow dynamic splitting of the offset ranges.
- allReaders() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- AlwaysPassMatcher() - Constructor for class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
- AlwaysPassMatcherFactory() - Constructor for class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcherFactory
- alwaysRetry() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
-
Always retry all failures.
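InsertRetryPolicy.alwaysRetry() is one of the factory methods on InsertRetryPolicy and is wired into a streaming-insert write via withFailedInsertRetryPolicy. A hedged sketch, assuming the Beam GCP IO module; the table name is a placeholder:

```java
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy;

public class RetryPolicyExample {
  // Sketch: retry every failed streaming insert. The table name is a placeholder;
  // InsertRetryPolicy.retryTransientErrors() is the usual alternative when a
  // permanently bad row should not be retried forever.
  public static BigQueryIO.Write<com.google.api.services.bigquery.model.TableRow> write() {
    return BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.my_table")
        .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
        .withFailedInsertRetryPolicy(InsertRetryPolicy.alwaysRetry());
  }
}
```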
- alwaysUseRead() - Method in class org.apache.beam.sdk.transforms.Create.Values
- AmqpIO - Class in org.apache.beam.sdk.io.amqp
-
AmqpIO supports the AMQP 1.0 protocol using the Apache QPid Proton-J library.
- AmqpIO.Read - Class in org.apache.beam.sdk.io.amqp
-
A PTransform to read/receive messages using the AMQP 1.0 protocol.
- AmqpIO.Write - Class in org.apache.beam.sdk.io.amqp
-
A PTransform to send messages using the AMQP 1.0 protocol.
- AmqpMessageCoder - Class in org.apache.beam.sdk.io.amqp
-
A coder for AMQP messages.
- AmqpMessageCoder() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
- AmqpMessageCoderProviderRegistrar - Class in org.apache.beam.sdk.io.amqp
-
A CoderProviderRegistrar for standard types used with AmqpIO.
- AmqpMessageCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpMessageCoderProviderRegistrar
- and(Iterable<PCollection<T>>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
Returns a new PCollectionList that has all the PCollections of this PCollectionList plus the given PCollections appended to the end, in order.
- and(String, PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
A version of KeyedPCollectionTuple.and(TupleTag, PCollection) that takes in a String instead of a TupleTag.
- and(String, PCollection<Row>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Returns a new PCollectionRowTuple that has each PCollection and tag of this PCollectionRowTuple plus the given PCollection associated with the given tag.
- and(String, PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
A version of PCollectionTuple.and(TupleTag, PCollection) that takes in a String instead of a TupleTag.
- and(List<TupleTag<?>>) - Method in class org.apache.beam.sdk.values.TupleTagList
-
Returns a new TupleTagList that has all the TupleTags of this TupleTagList plus the given TupleTags appended to the end, in order.
- and(PCollection.IsBounded) - Method in enum class org.apache.beam.sdk.values.PCollection.IsBounded
-
Returns the composed IsBounded property.
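The and(...) overloads above are all immutable builder-style appenders. A short sketch of the two most common ones, assuming the Beam Java core SDK; the tag and step names are illustrative placeholders:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PCollectionList;
import org.apache.beam.sdk.values.TupleTag;
import org.apache.beam.sdk.values.TupleTagList;

public class AndExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create();
    PCollection<String> first = p.apply("First", Create.of("a"));
    PCollection<String> second = p.apply("Second", Create.of("b"));

    // PCollectionList.and(...) returns a new list with the element appended, preserving order.
    PCollectionList<String> list = PCollectionList.of(first).and(second);

    // TupleTagList.and(...) chains tags, e.g. for ParDo.withOutputTags(main, TupleTagList.of(x).and(y)).
    TupleTag<String> errors = new TupleTag<String>() {};
    TupleTag<String> metrics = new TupleTag<String>() {};
    TupleTagList extraTags = TupleTagList.of(errors).and(metrics);
  }
}
```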
- and(PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
Returns a new PCollectionList that has all the PCollections of this PCollectionList plus the given PCollection appended to the end.
- and(TupleTag<?>) - Method in class org.apache.beam.sdk.values.TupleTagList
-
Returns a new TupleTagList that has all the TupleTags of this TupleTagList plus the given TupleTag appended to the end.
- and(TupleTag<T>, PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
Returns a new PCollectionTuple that has each PCollection and TupleTag of this PCollectionTuple plus the given PCollection associated with the given TupleTag.
- and(TupleTag<V>, List<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Returns a new CoGbkResult based on this, with the given tag and given data added to it.
- and(TupleTag<V>, PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Returns a new KeyedPCollectionTuple<K> that is the same as this, appended with the given PCollection.
- annotateFromBytes(PCollectionView<Map<ByteString, VideoContext>>, List<Feature>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
-
Annotates videos from ByteStrings of their contents.
- annotateFromBytesWithContext(List<Feature>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
-
Annotates videos from key-value pairs of ByteStrings and VideoContext.
- annotateFromURI(List<Feature>, PCollectionView<Map<String, VideoContext>>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
-
Annotates videos from GCS URIs.
- annotateFromUriWithContext(List<Feature>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
-
Annotates videos from key-value pairs of GCS URI and VideoContext.
- annotateImagesFromBytes(PCollectionView<Map<ByteString, ImageContext>>, List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a PTransform that annotates images from their contents encoded in ByteStrings.
- annotateImagesFromBytes(PCollectionView<Map<ByteString, ImageContext>>, List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a PTransform that annotates images from their contents encoded in ByteStrings.
- AnnotateImagesFromBytes(PCollectionView<Map<ByteString, ImageContext>>, List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytes
- annotateImagesFromBytesWithContext(List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a PTransform that annotates images from KVs of their GCS addresses in Strings and ImageContext for each image.
- annotateImagesFromBytesWithContext(List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a PTransform that annotates images from KVs of their GCS addresses in Strings and ImageContext for each image.
- AnnotateImagesFromBytesWithContext(List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytesWithContext
- annotateImagesFromGcsUri(PCollectionView<Map<String, ImageContext>>, List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a PTransform that annotates images from their GCS addresses.
- annotateImagesFromGcsUri(PCollectionView<Map<String, ImageContext>>, List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a PTransform that annotates images from their GCS addresses.
- AnnotateImagesFromGcsUri(PCollectionView<Map<String, ImageContext>>, List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUri
- annotateImagesFromGcsUriWithContext(List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a PTransform that annotates images from KVs of their String-encoded contents and ImageContext for each image.
- annotateImagesFromGcsUriWithContext(List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a PTransform that annotates images from KVs of their String-encoded contents and ImageContext for each image.
- AnnotateImagesFromGcsUriWithContext(List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUriWithContext
- AnnotateText - Class in org.apache.beam.sdk.extensions.ml
-
A PTransform using the Cloud AI Natural language processing capability.
- AnnotateText() - Constructor for class org.apache.beam.sdk.extensions.ml.AnnotateText
- AnnotateText.Builder - Class in org.apache.beam.sdk.extensions.ml
- AnnotateVideoFromBytes(PCollectionView<Map<ByteString, VideoContext>>, List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromBytes
- AnnotateVideoFromBytesWithContext(List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromBytesWithContext
- AnnotateVideoFromUri(PCollectionView<Map<String, VideoContext>>, List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromUri
- AnnotateVideoFromURIWithContext(List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromURIWithContext
- annotations - Variable in class org.apache.beam.sdk.transforms.PTransform
- Annotations For PipelineOptions - Search tag in interface org.apache.beam.sdk.options.PipelineOptions
- Section
- any(long) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Sample.any(long) takes a PCollection<T> and a limit, and produces a new PCollection<T> containing up to limit elements of the input PCollection.
- anyCombineFn(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a Combine.CombineFn that computes a fixed-sized potentially non-uniform sample of its inputs.
- anyOf(Iterable<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.anyOf(Iterable).
- anyOf(SerializableMatcher<T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.anyOf(Matcher[]).
- anything() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.anything().
- anyValueCombineFn() - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a Combine.CombineFn that computes a single and potentially non-uniform sample value of its inputs.
- API_METRIC_LABEL - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
- ApiIOError - Class in org.apache.beam.io.requestresponse
-
ApiIOError is a data class for storing details about an error.
- ApiIOError() - Constructor for class org.apache.beam.io.requestresponse.ApiIOError
- append(K, W, Iterator<V>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
-
Appends the values to the bag user state for the given key and window.
- APPEND - Enum constant in enum class org.apache.beam.sdk.io.redis.RedisIO.Write.Method
-
Use APPEND command.
- APPEND - Enum constant in enum class org.apache.beam.sdk.io.snowflake.enums.WriteDisposition
- APPEND_ROWS - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.RpcMethod
- appendRows(long, ProtoRows) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
-
Append rows to a Storage API write stream at the given offset.
- appendRowsRowStatusCounter(BigQuerySinkMetrics.RowStatus, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
- applicableTo(PCollection<?>) - Static method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
- applicableTo(PCollection<?>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
- ApplicationNameOptions - Interface in org.apache.beam.sdk.options
-
Options that allow setting the application name.
- apply(byte[]) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly
- apply(byte[]) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePubsubMessageProtoAsPayload
- apply(double, double) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
Applies the binary operation to the two operands, returning the result.
- apply(int, int) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
Applies the binary operation to the two operands, returning the result.
- apply(long, long) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
Applies the binary operation to the two operands, returning the result.
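The three BinaryCombine*Fn entries above share the same contract: subclasses supply only the binary operation (which must be associative and commutative) plus an identity element. A sketch for the long variant, assuming the Beam Java core SDK; the class name is illustrative:

```java
import org.apache.beam.sdk.transforms.Combine;

public class MaxLongFn extends Combine.BinaryCombineLongFn {
  // The binary operation applied pairwise to inputs and accumulators.
  @Override
  public long apply(long left, long right) {
    return Math.max(left, right);
  }

  // Identity element: combining it with any value x yields x.
  @Override
  public long identity() {
    return Long.MIN_VALUE;
  }
}
```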
- apply(MongoCollection<Document>) - Method in class org.apache.beam.sdk.io.mongodb.AggregationQuery
- apply(MongoCollection<Document>) - Method in class org.apache.beam.sdk.io.mongodb.FindQuery
- apply(InputT) - Method in interface org.apache.beam.sdk.coders.DelegateCoder.CodingFunction
- apply(InputT) - Method in class org.apache.beam.sdk.transforms.InferableFunction
- apply(InputT) - Method in interface org.apache.beam.sdk.transforms.ProcessFunction
-
Returns the result of invoking this function on the given input.
- apply(InputT) - Method in interface org.apache.beam.sdk.transforms.SerializableFunction
-
Returns the result of invoking this function on the given input.
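ProcessFunction and SerializableFunction are single-method interfaces, so a serializable lambda suffices; apply(input) is then invoked once per element, for example via MapElements. A minimal sketch, assuming the Beam Java core SDK:

```java
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.transforms.SerializableFunction;
import org.apache.beam.sdk.values.TypeDescriptors;

public class FnExample {
  // A SerializableFunction is a serializable lambda; MapElements.via(...)
  // calls apply(input) once per element of the input PCollection.
  static final SerializableFunction<String, Integer> LENGTH = s -> s.length();

  static final MapElements<String, Integer> TO_LENGTHS =
      MapElements.into(TypeDescriptors.integers()).via(LENGTH);
}
```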
- apply(InputT) - Method in class org.apache.beam.sdk.transforms.SimpleFunction
- apply(InputT, Contextful.Fn.Context) - Method in interface org.apache.beam.sdk.transforms.Contextful.Fn
-
Invokes the function on the given input with the given context.
- apply(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- apply(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Applies this CombineFn to a collection of input values to produce a combined output value.
- apply(Iterable<? extends InputT>, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Applies this CombineFnWithContext to a collection of input values to produce a combined output value.
- apply(String, Session) - Method in class org.apache.beam.sdk.io.jms.TextMessageMapper
- apply(String, PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
- apply(String, PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.values.PBegin
-
Applies the given PTransform to this PBegin, using name to identify this specific application of the transform.
- apply(String, PTransform<? super PCollection<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollection
-
Applies the given PTransform to this input PCollection, using name to identify this specific application of the transform.
- apply(String, PTransform<? super PCollectionRowTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Applies the given PTransform to this input PCollectionRowTuple, using name to identify this specific application of the transform.
- apply(String, PTransform<? super PCollectionTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
Applies the given PTransform to this input PCollectionTuple, using name to identify this specific application of the transform.
- apply(String, PTransform<KeyedPCollectionTuple<K>, OutputT>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
- apply(String, PTransform<PCollectionList<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
Applies the given PTransform to this input PCollectionList, using name to identify this specific application of the transform.
- apply(String, T) - Method in interface org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches.EntryMapperFn.Builder
- apply(Void) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceProviderFromDataSourceConfiguration
- apply(Void) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.PoolableDataSourceProvider
- apply(Void) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverProviderFromDriverConfiguration
- apply(Void) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceProviderFromDataSourceConfiguration
- apply(SQLException) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DefaultRetryStrategy
- apply(SQLException) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.RetryStrategy
- apply(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.GenericDatumFactory
- apply(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.ReflectDatumFactory
- apply(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.SpecificDatumFactory
- apply(Schema) - Method in interface org.apache.beam.sdk.extensions.avro.io.AvroSink.DatumWriterFactory
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.GenericDatumFactory
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.ReflectDatumFactory
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.SpecificDatumFactory
- apply(Schema, Schema) - Method in interface org.apache.beam.sdk.extensions.avro.io.AvroSource.DatumReaderFactory
- apply(ByteArray, Option<byte[]>, State<StateAndTimers>) - Method in class org.apache.beam.runners.spark.translation.streaming.ParDoStateUpdateFn
- apply(SqsMessage) - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadSchemaTransformProvider.SqsMessageToBeamRow
- apply(FileIO.ReadableFile, OffsetRange, Exception) - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler
- apply(HealthcareIOError<T>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
- apply(PubsubMessage) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePayloadAsPubsubMessageProto
- apply(Pipeline, String, RunnerApi.FunctionSpec, Map<String, PCollection<?>>) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.schemas.transforms.Cast.Narrowing
- apply(Schema, Schema) - Method in interface org.apache.beam.sdk.schemas.transforms.Cast.Validator
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.schemas.transforms.Cast.Widening
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
- apply(Materializations.IterableView<KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapViewFn
- apply(Materializations.IterableView<KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapViewFn
- apply(Materializations.IterableView<T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
- apply(Materializations.IterableView<T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
- apply(Materializations.IterableView<T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
- apply(Materializations.IterableView<T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
- apply(Materializations.MultimapView<Long, PCollectionViews.ValueOrMetadata<T, OffsetRange>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn2
- apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapFromVoidKeyViewFn
- apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapFromVoidKeyViewFn
- apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
-
Deprecated.
- apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
-
Deprecated.
- apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
- apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
-
Deprecated.
- apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated.
- apply(Materializations.MultimapView<K, V>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn2
- apply(Materializations.MultimapView<K, V>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn2
- apply(PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
-
Like Pipeline.apply(String, PTransform) but the transform node in the Pipeline graph will be named according to PTransform.getName().
- apply(PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.values.PBegin
-
Like PBegin.apply(String, PTransform) but defaulting to the name of the PTransform.
- apply(PTransform<? super PCollection<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollection
-
Like PCollection.apply(String, PTransform) but defaulting to the name of the PTransform.
- apply(PTransform<? super PCollectionRowTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Like PCollectionRowTuple.apply(String, PTransform) but defaulting to the name of the PTransform.
- apply(PTransform<? super PCollectionTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
Like PCollectionTuple.apply(String, PTransform) but defaulting to the name of the PTransform.
- apply(PTransform<KeyedPCollectionTuple<K>, OutputT>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Like KeyedPCollectionTuple.apply(String, PTransform) but defaulting to the name provided by the PTransform.
- apply(PTransform<PCollectionList<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
Like PCollectionList.apply(String, PTransform) but defaulting to the name of the PTransform.
- apply(WithFailures.ExceptionElement<T>) - Method in class org.apache.beam.sdk.transforms.WithFailures.ExceptionAsMapHandler
- apply(WithFailures.ExceptionElement<T>) - Method in class org.apache.beam.sdk.transforms.WithFailures.ThrowableHandler
- apply(KV<String, Long>) - Method in class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.FormatAsTextFn
- apply(Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.BeamSqlRow2KvFn
- apply(Row) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation.ToBigtableRowFn
- apply(Row) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.GetMutationsFromBeamRow
- apply(ValueInSingleWindow<byte[]>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.ParsePubsubMessageProtoAsPayloadFromWindowedValue
- apply(TopicPartition) - Method in class org.apache.beam.sdk.io.kafka.CheckStopReadingFnWrapper
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
- apply(Statement, Description) - Method in class org.apache.beam.sdk.testing.TestPipeline
- apply(PrimitiveViewT) - Method in class org.apache.beam.sdk.transforms.ViewFn
-
A function to adapt a primitive view type to a desired view type.
- apply(Tuple2<ByteArray, byte[]>) - Method in class org.apache.beam.runners.spark.coders.CoderHelpers.FromByteFunction
- apply(T) - Method in class org.apache.beam.sdk.testing.PAssert.MatcherCheckerFn
- apply(T1) - Method in interface org.apache.beam.sdk.function.ThrowingFunction
- apply(T1, T2) - Method in interface org.apache.beam.sdk.function.ThrowingBiFunction
- apply(V, V) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Applies the binary operation to the two operands, returning the result.
- applyBeamPTransform(Map<String, ? extends DataSet<?>>, PTransform<PCollectionTuple, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyBeamPTransform(Map<String, ? extends DataStream<?>>, PTransform<PCollectionTuple, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyBeamPTransform(DataSet<InputT>, PTransform<CollectionT, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyBeamPTransform(ExecutionEnvironment, PTransform<PBegin, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyBeamPTransform(DataStream<InputT>, PTransform<CollectionT, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyBeamPTransform(StreamExecutionEnvironment, PTransform<PBegin, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyInputWatermarkHold(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
-
Allows a hold to be applied to the input watermark.
- applyInputWatermarkHold(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- applyMultiOutputBeamPTransform(Map<String, ? extends DataSet<?>>, PTransform<PCollectionTuple, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyMultiOutputBeamPTransform(Map<String, ? extends DataStream<?>>, PTransform<PCollectionTuple, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyMultiOutputBeamPTransform(DataSet<InputT>, PTransform<CollectionT, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyMultiOutputBeamPTransform(ExecutionEnvironment, PTransform<PBegin, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyMultiOutputBeamPTransform(DataStream<InputT>, PTransform<CollectionT, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyMultiOutputBeamPTransform(StreamExecutionEnvironment, PTransform<PBegin, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyNoOutputBeamPTransform(Map<String, ? extends DataSet<?>>, PTransform<PCollectionTuple, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyNoOutputBeamPTransform(Map<String, ? extends DataStream<?>>, PTransform<PCollectionTuple, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyNoOutputBeamPTransform(DataSet<InputT>, PTransform<CollectionT, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyNoOutputBeamPTransform(ExecutionEnvironment, PTransform<PBegin, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyNoOutputBeamPTransform(DataStream<InputT>, PTransform<CollectionT, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyNoOutputBeamPTransform(StreamExecutionEnvironment, PTransform<PBegin, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyOutputWatermarkHold(long, long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
-
Allows a hold to be applied to the output watermark before it is sent out.
- applyOutputWatermarkHold(long, long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- applyRowMutations() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Write RowMutation messages to BigQuery.
- applySdkEnvironmentOverrides(RunnerApi.Pipeline, DataflowPipelineOptions) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
- applyTransform(InputT, PTransform<? super InputT, OutputT>) - Static method in class org.apache.beam.sdk.Pipeline
-
For internal use only; no backwards-compatibility guarantees.
- applyTransform(String, InputT, PTransform<? super InputT, OutputT>) - Static method in class org.apache.beam.sdk.Pipeline
-
For internal use only; no backwards-compatibility guarantees.
- applyWindowing() - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
- ApproximateCountDistinct - Class in org.apache.beam.sdk.extensions.zetasketch
-
PTransforms for estimating the number of distinct elements in a PCollection, or the number of distinct values associated with each key in a PCollection of KVs.
- ApproximateCountDistinct() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- ApproximateCountDistinct.Globally<T> - Class in org.apache.beam.sdk.extensions.zetasketch
-
PTransform for estimating the number of distinct elements in a PCollection.
- ApproximateCountDistinct.Globally.Builder<T> - Class in org.apache.beam.sdk.extensions.zetasketch
- ApproximateCountDistinct.PerKey<K, V> - Class in org.apache.beam.sdk.extensions.zetasketch
- ApproximateCountDistinct.PerKey.Builder<K, V> - Class in org.apache.beam.sdk.extensions.zetasketch
- ApproximateDistinct - Class in org.apache.beam.sdk.extensions.sketching
-
PTransforms for computing the approximate number of distinct elements in a stream.
- ApproximateDistinct() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- ApproximateDistinct.ApproximateDistinctFn<InputT> - Class in org.apache.beam.sdk.extensions.sketching
-
Implements the Combine.CombineFn of ApproximateDistinct transforms.
- ApproximateDistinct.GloballyDistinct<InputT> - Class in org.apache.beam.sdk.extensions.sketching
-
Implementation of ApproximateDistinct.globally().
- ApproximateDistinct.HyperLogLogPlusCoder - Class in org.apache.beam.sdk.extensions.sketching
-
Coder for HyperLogLogPlus class.
- ApproximateDistinct.PerKeyDistinct<K, V> - Class in org.apache.beam.sdk.extensions.sketching
-
Implementation of ApproximateDistinct.perKey().
- ApproximateQuantiles - Class in org.apache.beam.sdk.transforms
-
PTransforms for getting an idea of a PCollection's data distribution using approximate N-tiles (e.g.
- ApproximateQuantiles.ApproximateQuantilesCombineFn<T, ComparatorT> - Class in org.apache.beam.sdk.transforms
-
The ApproximateQuantilesCombineFn combiner gives an idea of the distribution of a collection of values using approximate N-tiles.
- ApproximateUnique - Class in org.apache.beam.sdk.transforms
-
Deprecated.
- ApproximateUnique() - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique
-
Deprecated.
- ApproximateUnique.ApproximateUniqueCombineFn<T> - Class in org.apache.beam.sdk.transforms
-
Deprecated.
CombineFn that computes an estimate of the number of distinct values that were combined.
- ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique - Class in org.apache.beam.sdk.transforms
-
Deprecated. A heap utility class to efficiently track the largest added elements.
- ApproximateUnique.Globally<T> - Class in org.apache.beam.sdk.transforms
-
Deprecated.
PTransform for estimating the number of distinct elements in a PCollection.
- ApproximateUnique.PerKey<K, V> - Class in org.apache.beam.sdk.transforms
-
Deprecated.
PTransform for estimating the number of distinct values associated with each key in a PCollection of KVs.
- ApproximateUniqueCombineFn(long, Coder<T>) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- arbitrarily() - Static method in class org.apache.beam.sdk.transforms.Redistribute
- array() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns the backing array.
- array(TableSchema.ColumnType) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- array(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Create an array type for the given field type.
- array(Schema.FieldType, boolean) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Deprecated.Set the nullability on the elementType instead
- ARRAY - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- ARRAY - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- ArrayAgg - Class in org.apache.beam.sdk.extensions.sql.impl.udaf
- ArrayAgg() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg
- ArrayAgg.ArrayAggArray<T> - Class in org.apache.beam.sdk.extensions.sql.impl.udaf
- ArrayAggArray() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- arrayContaining(List<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.arrayContaining(List). - arrayContaining(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.arrayContaining(Object[]). - arrayContaining(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.arrayContaining(Matcher[]). - arrayContaining(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.arrayContaining(Object[]). - arrayContainingInAnyOrder(Collection<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.arrayContainingInAnyOrder(Collection). - arrayContainingInAnyOrder(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.arrayContainingInAnyOrder(Object[]). - arrayContainingInAnyOrder(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.arrayContainingInAnyOrder(Matcher[]). - arrayContainingInAnyOrder(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.arrayContainingInAnyOrder(Object[]). - ArrayCopyState() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayCopyState
- arrayElementType() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- ArrayNewState() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayNewState
- ArrayOfNestedStringBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfNestedStringBundle
- ArrayOfStringBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfStringBundle
- arrayQualifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- arrayQualifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
- ArrayQualifierContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
- ArrayQualifierListContext(FieldSpecifierNotationParser.QualifierListContext) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
- arrayWithSize(int) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.arrayWithSize(int). - arrayWithSize(SerializableMatcher<? super Integer>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.arrayWithSize(Matcher). - ArrowConversion - Class in org.apache.beam.sdk.extensions.arrow
- ArrowConversion.ArrowSchemaTranslator - Class in org.apache.beam.sdk.extensions.arrow
-
Converts Arrow schema to Beam row schema.
- ArrowConversion.RecordBatchRowIterator - Class in org.apache.beam.sdk.extensions.arrow
- arrowSchemaFromInput(InputStream) - Static method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion
- ArrowSchemaTranslator() - Constructor for class org.apache.beam.sdk.extensions.arrow.ArrowConversion.ArrowSchemaTranslator
- ArtifactDestination() - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
- ArtifactRetrievalService - Class in org.apache.beam.runners.fnexecution.artifact
-
An
ArtifactRetrievalService that uses FileSystems as its backing storage. - ArtifactRetrievalService() - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- ArtifactRetrievalService(int) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- ArtifactRetrievalService(ArtifactResolver) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- ArtifactRetrievalService(ArtifactResolver, int) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- ArtifactStagingService - Class in org.apache.beam.runners.fnexecution.artifact
- ArtifactStagingService(ArtifactStagingService.ArtifactDestinationProvider) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
- ArtifactStagingService.ArtifactDestination - Class in org.apache.beam.runners.fnexecution.artifact
-
A pairing of a newly created artifact type and an output stream that will be readable at that type.
- ArtifactStagingService.ArtifactDestinationProvider - Interface in org.apache.beam.runners.fnexecution.artifact
-
Provides a concrete location to which artifacts can be staged on retrieval.
- as(Class<T>) - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
Transforms this object into an object of type
<T>, saving each property that has been manipulated. - as(Class<T>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Creates and returns an object that implements
<T>. - as(Class<T>) - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
-
Creates and returns an object that implements
<T> using the values configured on this builder during construction. - asCloudObject(Coder<?>, SdkComponents) - Static method in class org.apache.beam.runners.dataflow.util.CloudObjects
-
Convert the provided
Coder into a CloudObject. - asInputStream(int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns an
InputStream wrapper which supplies the portion of this backing byte buffer starting at offset and up to length bytes. - asIterable() - Static method in class org.apache.beam.sdk.transforms.View
-
Returns a
View.AsIterable transform that takes a PCollection as input and produces a PCollectionView mapping each window to an Iterable of the values in that window. - AsJsons<InputT> - Class in org.apache.beam.sdk.extensions.jackson
-
PTransform for serializing objects to JSON Strings. - AsJsons.AsJsonsWithFailures<FailureT> - Class in org.apache.beam.sdk.extensions.jackson
-
A
PTransform that adds exception handling to AsJsons. - asList() - Static method in class org.apache.beam.sdk.transforms.View
-
Returns a
View.AsList transform that takes a PCollection and returns a PCollectionView mapping each window to a List containing all of the elements in the window. - asMap() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
- asMap() - Static method in class org.apache.beam.sdk.transforms.View
-
Returns a
View.AsMap transform that takes a PCollection<KV<K, V>> as input and produces a PCollectionView mapping each window to a Map<K, V>. - asMultimap() - Static method in class org.apache.beam.sdk.transforms.View
-
Returns a
View.AsMultimap transform that takes a PCollection<KV<K, V>> as input and produces a PCollectionView mapping each window to its contents as a Map<K, Iterable<V>> for use as a side input. - asOutputReference(PValue, AppliedPTransform<?, ?, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Encode a PValue reference as an output reference.
- asOutputStream() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns an output stream which writes to the backing buffer from the current position.
- asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Returns the string representation of this subscription as a path used in the Cloud Pub/Sub API.
- asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Returns the string representation of this topic as a path used in the Cloud Pub/Sub API.
- asQueryable(QueryProvider, SchemaPlus, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
- asResponseObserver() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
- assertionError() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
- assertSourcesEqualReferenceSource(BoundedSource<T>, List<? extends BoundedSource<T>>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Given a reference
Source and a list of Sources, assert that the union of the records read from the list of sources is equal to the records read from the reference source. - assertSplitAtFractionBehavior(BoundedSource<T>, int, double, SourceTestUtils.ExpectedSplitOutcome, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Asserts that the
source's reader either fails to splitAtFraction(fraction) after reading numItemsToReadBeforeSplit items, or succeeds in a way that is consistent according to SourceTestUtils.assertSplitAtFractionSucceedsAndConsistent(org.apache.beam.sdk.io.BoundedSource<T>, int, double, org.apache.beam.sdk.options.PipelineOptions). - assertSplitAtFractionExhaustive(BoundedSource<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Asserts that for each possible start position,
BoundedSource.BoundedReader.splitAtFraction(double) at every interesting fraction (halfway between two fractions that differ by at least one item) can be called successfully and the results are consistent if a split succeeds. - assertSplitAtFractionFails(BoundedSource<T>, int, double, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Asserts that the
source's reader fails to splitAtFraction(fraction) after reading numItemsToReadBeforeSplit items. - assertSplitAtFractionSucceedsAndConsistent(BoundedSource<T>, int, double, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Verifies some consistency properties of
BoundedSource.BoundedReader.splitAtFraction(double) on the given source. - assertSubscriptionEventuallyCreated(String, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Block until a subscription is created for this test topic in the specified project.
- assertThatAllRows(Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
- assertThatTopicEventuallyReceives(Matcher<PubsubMessage>...) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Repeatedly pull messages from
TestPubsub.subscriptionPath() until receiving one for each matcher (or timeout is reached), then assert that the received messages match the expectations. - assertUnstartedReaderReadsSameAsItsSource(BoundedSource.BoundedReader<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Assert that a
Reader returns a Source that, when read from, produces the same records as the reader. - assign(BoundedWindow, Instant) - Method in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Shorthand for
TimestampCombiner.merge(org.apache.beam.sdk.transforms.windowing.BoundedWindow, java.lang.Iterable<? extends org.joda.time.Instant>) with just one element, to place it into the context of a window. - assignableTo(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns true if this Schema can be assigned to another Schema.
- assignableToIgnoreNullable(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns true if this Schema can be assigned to another Schema, ignoring nullable.
- AssignContext() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
- assignedWindows(WindowFn<T, W>, long) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
- assignedWindowsWithValue(WindowFn<T, W>, TimestampedValue<T>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
- AssignShardFn(Integer) - Constructor for class org.apache.beam.sdk.transforms.Reshuffle.AssignShardFn
- assignShardKey(DestinationT, UserT, int) - Method in interface org.apache.beam.sdk.io.ShardingFunction
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Returns true if this
WindowFn always assigns an element to exactly one window. - assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
-
Returns the single window to which elements with this timestamp belong.
- AssignWindowP<T> - Class in org.apache.beam.runners.jet.processors
-
Jet
Processor implementation for Beam's Windowing primitive. - assignWindows(WindowFn.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- assignWindows(WindowFn.AssignContext) - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
- assignWindows(WindowFn.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- assignWindows(WindowFn.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- assignWindows(WindowFn.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
- assignWindows(WindowFn.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Given a timestamp and element, returns the set of windows into which it should be placed.
- assignWindows(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- AssignWindowsFunction<T> - Class in org.apache.beam.runners.twister2.translators.functions
-
Assign Windows function.
- AssignWindowsFunction(WindowFn<T, BoundedWindow>, PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.translators.functions.AssignWindowsFunction
- AssignWindowTranslatorBatch<T> - Class in org.apache.beam.runners.twister2.translators.batch
-
Assign Window translator.
- AssignWindowTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.AssignWindowTranslatorBatch
- asSingleton() - Static method in class org.apache.beam.sdk.transforms.View
-
Returns a
View.AsSingleton transform that takes a PCollection with a single value per window as input and produces a PCollectionView that returns the value in the main input window when read as a side input. - asSingletonView() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns a
PTransform that produces a PCollectionView whose elements are the result of combining elements per-window in the input PCollection. - assumedRoleArn() - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider
- assumeSingleMessageSchema() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
- ASTERISK - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
- ASTERISK_RELUCTANT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
- asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Deprecated. The v1beta1 API for Cloud Pub/Sub is deprecated.
- asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Deprecated. The v1beta1 API for Cloud Pub/Sub is deprecated.
- asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Deprecated. The v1beta2 API for Cloud Pub/Sub is deprecated.
- asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Deprecated. The v1beta2 API for Cloud Pub/Sub is deprecated.
- AsyncBatchWriteHandler<RecT,
ResT> - Class in org.apache.beam.sdk.io.aws2.common -
Async handler that automatically retries unprocessed records in case of a partial success.
- AsyncBatchWriteHandler(int, FluentBackoff, AsyncBatchWriteHandler.Stats, Function<ResT, String>, BiFunction<String, List<RecT>, CompletableFuture<List<ResT>>>) - Constructor for class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
- AsyncBatchWriteHandler.Stats - Interface in org.apache.beam.sdk.io.aws2.common
-
Statistics on the batch request.
- AsyncWatermarkCache - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.cache
-
Asynchronously computes the earliest partition watermark and stores it in memory.
- AsyncWatermarkCache(PartitionMetadataDao, Duration) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.cache.AsyncWatermarkCache
- atMinimumTimestamp(V) - Static method in class org.apache.beam.sdk.values.TimestampedValue
-
Returns a new
TimestampedValue with the minimum timestamp. - AtomicAccumulatorState() - Constructor for class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark.AtomicAccumulatorState
- AtomicCoder<T> - Class in org.apache.beam.sdk.coders
- AtomicCoder() - Constructor for class org.apache.beam.sdk.coders.AtomicCoder
- AtomicLongFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.AtomicLongFactory
- atomicRead(KafkaIOUtilsBenchmark.AtomicAccumulatorState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- atomicReadWhileWriting(KafkaIOUtilsBenchmark.AtomicAccumulatorState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- atomicWrite(KafkaIOUtilsBenchmark.AtomicAccumulatorState, KafkaIOUtilsBenchmark.ProducerState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- atomicWriteWhileReading(KafkaIOUtilsBenchmark.AtomicAccumulatorState, KafkaIOUtilsBenchmark.ProducerState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- attached() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- attachValues(Object...) - Method in class org.apache.beam.sdk.values.Row.Builder
- attachValues(List<Object>) - Method in class org.apache.beam.sdk.values.Row.Builder
- attempted(MetricKey, T) - Static method in class org.apache.beam.sdk.metrics.MetricResult
- ATTRIBUTE_ARRAY_ENTRY_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- ATTRIBUTE_ARRAY_FIELD_TYPE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- ATTRIBUTE_MAP_FIELD_TYPE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- AttributeValueCoder - Class in org.apache.beam.sdk.io.aws2.dynamodb
-
A
Coder that serializes and deserializes the AttributeValue objects. - audience() - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider
- AUTH_VALIDATION_GROUP - Static variable in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- AuthenticatedRetryInitializer(GoogleCredentials) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.AuthenticatedRetryInitializer
- Authentication - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- AUTO - Enum constant in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- AUTO - Enum constant in enum class org.apache.beam.sdk.io.Compression
-
When reading a file, automatically determine the compression type based on filename extension.
- AUTO - Enum constant in enum class org.apache.beam.sdk.io.TextIO.CompressionType
-
Deprecated.
- AUTO - Enum constant in enum class org.apache.beam.sdk.io.TFRecordIO.CompressionType
-
Deprecated.
- AUTO - Enum constant in enum class org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
-
Deprecated.
- AUTO - Static variable in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- autoCastField(Schema.Field, Object) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
-
Attempt to cast an object to a specified Schema.Field.Type.
- autoLoadUserDefinedFunctions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Load UDF/UDAFs from
UdfUdafProvider. - AutoScaler - Interface in org.apache.beam.sdk.io.jms
-
Enables users to specify their own `JMS` backlog reporters enabling
JmsIO to report UnboundedSource.UnboundedReader.getTotalBacklogBytes(). - AUTOVALUE_CLASS - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- AUTOVALUE_CLASS - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
- AutoValueSchema - Class in org.apache.beam.sdk.schemas
-
A
SchemaProvider for AutoValue classes. - AutoValueSchema() - Constructor for class org.apache.beam.sdk.schemas.AutoValueSchema
- AutoValueSchema.AbstractGetterTypeSupplier - Class in org.apache.beam.sdk.schemas
-
FieldValueTypeSupplier that's based on AutoValue getters. - AutoValueUtils - Class in org.apache.beam.sdk.schemas.utils
-
Utilities for managing AutoValue schemas.
- AutoValueUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.AutoValueUtils
- AVAILABLE_NOW - Static variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- AvailableParallelismFactory() - Constructor for class org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory
- Available transforms - Search tag in class org.apache.beam.sdk.managed.Managed
- Section
- AVG - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- AVRO_CODER_URN - Static variable in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderRegistrar
- AvroCoder<T> - Class in org.apache.beam.sdk.extensions.avro.coders
-
A
Coder using Avro binary format. - AvroCoder(Class<T>, Schema) - Constructor for class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- AvroCoder(Class<T>, Schema, boolean) - Constructor for class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- AvroCoder(AvroDatumFactory<T>, Schema) - Constructor for class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- AvroConvertType(boolean) - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertType
- AvroDatumFactory<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Create
DatumReader and DatumWriter for given schemas. - AvroDatumFactory(Class<T>) - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
- AvroDatumFactory.GenericDatumFactory - Class in org.apache.beam.sdk.extensions.avro.io
-
Specialized
AvroDatumFactory for GenericRecord. - AvroDatumFactory.ReflectDatumFactory<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Specialized
AvroDatumFactory for Java classes transforming to Avro through reflection. - AvroDatumFactory.SpecificDatumFactory<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Specialized
AvroDatumFactory for SpecificRecord. - AvroGenericCoder - Class in org.apache.beam.sdk.extensions.avro.coders
-
AvroCoder specialisation for GenericRecord, needed for cross-language transforms.
- AvroGenericCoderRegistrar - Class in org.apache.beam.sdk.extensions.avro
-
Coder registrar for AvroGenericCoder.
- AvroGenericCoderRegistrar() - Constructor for class org.apache.beam.sdk.extensions.avro.AvroGenericCoderRegistrar
- AvroGenericCoderTranslator - Class in org.apache.beam.sdk.extensions.avro
-
Coder translator for AvroGenericCoder.
- AvroGenericCoderTranslator() - Constructor for class org.apache.beam.sdk.extensions.avro.AvroGenericCoderTranslator
- AvroGenericRecordToStorageApiProto - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Utility methods for converting Avro
GenericRecord objects to dynamic protocol message, for use with the Storage write API. - AvroGenericRecordToStorageApiProto() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.AvroGenericRecordToStorageApiProto
- AvroIO - Class in org.apache.beam.sdk.extensions.avro.io
-
PTransforms for reading and writing Avro files. - AvroIO.Parse<T> - Class in org.apache.beam.sdk.extensions.avro.io
- AvroIO.ParseAll<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Deprecated. See
AvroIO.parseAllGenericRecords(SerializableFunction) for details. - AvroIO.ParseFiles<T> - Class in org.apache.beam.sdk.extensions.avro.io
- AvroIO.Read<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Implementation of
AvroIO.read(java.lang.Class<T>) and AvroIO.readGenericRecords(org.apache.avro.Schema). - AvroIO.ReadAll<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Deprecated. See
AvroIO.readAll(Class) for details. - AvroIO.ReadFiles<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Implementation of
AvroIO.readFiles(java.lang.Class<T>). - AvroIO.RecordFormatter<ElementT> - Interface in org.apache.beam.sdk.extensions.avro.io
-
Deprecated. Users can achieve the same by providing this transform in a
ParDo before using write in AvroIO.write(Class). - AvroIO.Sink<ElementT> - Class in org.apache.beam.sdk.extensions.avro.io
- AvroIO.TypedWrite<UserT,
DestinationT, - Class in org.apache.beam.sdk.extensions.avro.ioOutputT> -
Implementation of
AvroIO.write(java.lang.Class<T>). - AvroIO.Write<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
This class is used as the default return value of
AvroIO.write(java.lang.Class<T>). - AvroJavaTimeConversions - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
-
Avro 1.8 ships with joda time conversions only.
- AvroJavaTimeConversions() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions
- AvroJavaTimeConversions.DateConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.LocalTimestampMicros - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.LocalTimestampMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.LocalTimestampMillis - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.LocalTimestampMillisConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.TimeMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.TimeMillisConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.TimestampMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.TimestampMillisConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJodaTimeConversions - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
-
Avro 1.8 & 1.9 ship joda time conversions.
- AvroJodaTimeConversions() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions
- AvroJodaTimeConversions.DateConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJodaTimeConversions.LossyTimeMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJodaTimeConversions.LossyTimestampMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJodaTimeConversions.TimeConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJodaTimeConversions.TimeMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJodaTimeConversions.TimestampConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJodaTimeConversions.TimestampMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroPayloadSerializerProvider - Class in org.apache.beam.sdk.extensions.avro.schemas.io.payloads
- AvroPayloadSerializerProvider() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.io.payloads.AvroPayloadSerializerProvider
- AvroReader(AvroSource<T>) - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
-
Reads Avro records of type
T from the specified source. - AvroReadSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
- AvroReadSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.AvroReadSchemaTransformFormatProvider
- AvroRecordSchema - Class in org.apache.beam.sdk.extensions.avro.schemas
-
A
SchemaProvider for AVRO generated SpecificRecords and POJOs. - AvroRecordSchema() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.AvroRecordSchema
- AvroSchemaInformationProvider - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroSchemaInformationProvider() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroSchemaInformationProvider
- AvroSchemaIOProvider - Class in org.apache.beam.sdk.extensions.avro.io
-
An implementation of
SchemaIOProvider for reading and writing Avro files with AvroIO. - AvroSchemaIOProvider() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
- AvroSink<UserT,
DestinationT, - Class in org.apache.beam.sdk.extensions.avro.ioOutputT> -
A
FileBasedSink for Avro files. - AvroSink.DatumWriterFactory<T> - Interface in org.apache.beam.sdk.extensions.avro.io
- AvroSource<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Do not use in pipelines directly: most users should use
AvroIO.Read. - AvroSource.AvroReader<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
A
BlockBasedSource.BlockBasedReader for reading blocks from Avro files. - AvroSource.DatumReaderFactory<T> - Interface in org.apache.beam.sdk.extensions.avro.io
- AvroTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.avro
-
TableProvider for AvroIO for consumption by Beam SQL. - AvroTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.avro.AvroTableProvider
- AvroUtils - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
-
Utils to convert AVRO records to Beam rows.
- AvroUtils.AvroConvertType - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroUtils.AvroConvertValueForGetter - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroUtils.AvroConvertValueForSetter - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroUtils.FixedBytesField - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
-
Wrapper for fixed byte fields.
- AvroUtils.TypeWithNullability - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroWriteRequest<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
- AvroWriteRequest(T, Schema) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
- AvroWriteSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
-
A
FileWriteSchemaTransformFormatProvider for the Avro format. - AvroWriteSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.AvroWriteSchemaTransformFormatProvider
- awaitCompletion() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Uses the caller's thread to process all received elements until the end of the stream has been received from the upstream producer for all specified endpoints.
- awaitTermination(Duration) - Method in class org.apache.beam.runners.spark.SparkPipelineResult
- awaitTermination(Duration) - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger.DebugSparkPipelineResult
- AwsBuilderFactory<PojoT,
BuilderT> - Class in org.apache.beam.sdk.io.aws2.schemas -
Builder factory for AWS
SdkPojo to avoid using reflection to instantiate a builder. - AwsBuilderFactory() - Constructor for class org.apache.beam.sdk.io.aws2.schemas.AwsBuilderFactory
- AwsModule - Class in org.apache.beam.sdk.io.aws2.options
-
A Jackson
Module that registers a JsonSerializer and JsonDeserializer for AwsCredentialsProvider and some of its subclasses. - AwsModule() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsModule
- AwsOptions - Interface in org.apache.beam.sdk.io.aws2.options
-
Options used to configure Amazon Web Services specifics such as credentials and region.
- AwsOptions.AwsRegionFactory - Class in org.apache.beam.sdk.io.aws2.options
-
Attempts to load the default region.
- AwsOptions.AwsUserCredentialsFactory - Class in org.apache.beam.sdk.io.aws2.options
-
Returns
DefaultCredentialsProvider as the default provider. - AwsPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.aws2.options
-
A registrar containing the default AWS options.
- AwsPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsPipelineOptionsRegistrar
- AwsRegionFactory() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsRegionFactory
- AwsSchemaProvider - Class in org.apache.beam.sdk.io.aws2.schemas
-
Schema provider for AWS
SdkPojo models using the provided field metadata (see SdkPojo.sdkFields()) rather than reflection. - AwsSchemaProvider() - Constructor for class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
- AwsSchemaRegistrar - Class in org.apache.beam.sdk.io.aws2.schemas
- AwsSchemaRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaRegistrar
- AwsSerializableUtils - Class in org.apache.beam.sdk.io.aws2.options
-
Utilities for working with AWS Serializables.
- AwsSerializableUtils() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsSerializableUtils
- AwsTypes - Class in org.apache.beam.sdk.io.aws2.schemas
- AwsTypes() - Constructor for class org.apache.beam.sdk.io.aws2.schemas.AwsTypes
- AwsUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsUserCredentialsFactory
- AzureBlobStoreFileSystemRegistrar - Class in org.apache.beam.sdk.io.azure.blobstore
-
AutoService registrar for the AzureBlobStoreFileSystem. - AzureBlobStoreFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.azure.blobstore.AzureBlobStoreFileSystemRegistrar
- AzureModule - Class in org.apache.beam.sdk.io.azure.options
-
A Jackson
Module that registers a JsonSerializer and JsonDeserializer for Azure credential providers. - AzureModule() - Constructor for class org.apache.beam.sdk.io.azure.options.AzureModule
- AzureOptions - Interface in org.apache.beam.sdk.io.azure.options
- AzureOptions.AzureUserCredentialsFactory - Class in org.apache.beam.sdk.io.azure.options
-
Attempts to load Azure credentials.
- AzurePipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.azure.options
-
A registrar containing the default Azure options.
- AzurePipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.azure.options.AzurePipelineOptionsRegistrar
- AzureUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.io.azure.options.AzureOptions.AzureUserCredentialsFactory
B
- BACKLOG_UNKNOWN - Static variable in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Constant representing an unknown amount of backlog.
- backlogBytes() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Gauge for source backlog in bytes.
- backlogBytesOfSplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Gauge for source split backlog in bytes.
- backlogElements() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Gauge for source backlog in elements.
- backlogElementsOfSplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Gauge for source split backlog in elements.
- BackOffAdapter - Class in org.apache.beam.sdk.extensions.gcp.util
-
An adapter for converting between Apache Beam and Google API client representations of backoffs.
- BackOffAdapter() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.BackOffAdapter
- BAD_RECORD_TAG - Static variable in interface org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter
- BadRecord - Class in org.apache.beam.sdk.transforms.errorhandling
- BadRecord() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord
- BadRecord.Builder - Class in org.apache.beam.sdk.transforms.errorhandling
- BadRecord.Failure - Class in org.apache.beam.sdk.transforms.errorhandling
- BadRecord.Failure.Builder - Class in org.apache.beam.sdk.transforms.errorhandling
- BadRecord.Record - Class in org.apache.beam.sdk.transforms.errorhandling
- BadRecord.Record.Builder - Class in org.apache.beam.sdk.transforms.errorhandling
- BadRecordErrorHandler(PTransform<PCollection<BadRecord>, OutputT>, Pipeline) - Constructor for class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.BadRecordErrorHandler
-
Constructs a new ErrorHandler for handling BadRecords.
- BadRecordRouter - Interface in org.apache.beam.sdk.transforms.errorhandling
- BadRecordRouter.RecordingBadRecordRouter - Class in org.apache.beam.sdk.transforms.errorhandling
- BadRecordRouter.ThrowingBadRecordRouter - Class in org.apache.beam.sdk.transforms.errorhandling
- bag() - Static method in class org.apache.beam.sdk.state.StateSpecs
- bag(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Identical to
StateSpecs.bag(), but with an element coder explicitly supplied. - BagState<T> - Interface in org.apache.beam.sdk.state
-
A
ReadableStatecell containing a bag of values. - BagUserStateSpec() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
- BASE_IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.FixedPrecisionNumeric
-
Identifier of the unspecified precision numeric type.
- baseBackoff() - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
- baseBackoff(Duration) - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
- BaseBeamTable - Class in org.apache.beam.sdk.extensions.sql.meta
-
Basic implementation of
BeamSqlTable. - BaseBeamTable() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
- baseNameBuilder(String) - Static method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.MetricNameBuilder
- baseUrl() - Method in class org.apache.beam.sdk.io.solr.SolrIO.ReplicaInfo
- BASIC - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
- BASIC - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
- BASIC_CONNECTION_INFO_VALIDATION_GROUP - Static variable in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- BasicAuthJcsmpSessionServiceFactory - Class in org.apache.beam.sdk.io.solace.broker
-
A factory for creating
JcsmpSessionService instances. - BasicAuthJcsmpSessionServiceFactory() - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
- BasicAuthJcsmpSessionServiceFactory.Builder - Class in org.apache.beam.sdk.io.solace.broker
- BasicAuthSempClient - Class in org.apache.beam.sdk.io.solace.broker
-
A class that manages REST calls to the Solace Element Management Protocol (SEMP) using basic authentication.
- BasicAuthSempClient(String, String, String, String, SerializableSupplier<HttpRequestFactory>) - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClient
- BasicAuthSempClientFactory - Class in org.apache.beam.sdk.io.solace.broker
-
A factory for creating
BasicAuthSempClient instances. - BasicAuthSempClientFactory() - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory
- BasicAuthSempClientFactory.Builder - Class in org.apache.beam.sdk.io.solace.broker
- Basic Usage - Search tag in class org.apache.beam.io.requestresponse.RequestResponseIO
- Section
- BATCH - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
-
Specifies that a query should be run with a BATCH priority.
- BATCH - Enum constant in enum class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.ScanType
- BATCH_IMPORT - Enum constant in enum class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.WriteMethod
-
Batch import write method.
- batchCombinePerKey(FlinkStreamingTranslationContext, PTransform<PCollection<KV<K, InputT>>, PCollection<KV<K, OutputT>>>, CombineFnBase.GlobalCombineFn<InputT, AccumT, OutputT>, Map<Integer, PCollectionView<?>>, List<PCollectionView<?>>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
- batchCombinePerKeyNoSideInputs(FlinkStreamingTranslationContext, PTransform<PCollection<KV<K, InputT>>, PCollection<KV<K, OutputT>>>, CombineFnBase.GlobalCombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
- BatchContextImpl - Class in org.apache.beam.sdk.io.cdap.context
-
Context class that Batch, Sink, and Stream CDAP wrapper classes use to provide common details.
- BatchContextImpl() - Constructor for class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- BATCHED - Enum constant in enum class org.apache.beam.sdk.io.solace.SolaceIO.WriterType
- batchGetDocuments() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
-
Factory method to create a new type-safe builder for
BatchGetDocumentsRequest operations. - batchGroupByKey(FlinkStreamingTranslationContext, PTransform<PCollection<KV<K, InputT>>, PCollection<KV<K, Iterable<InputT>>>>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
-
Creates a two-step GBK operation.
- Batching and Grouping - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- BatchingParams() - Constructor for class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- BatchSideInputHandlerFactory - Class in org.apache.beam.runners.fnexecution.translation
-
StateRequestHandler that uses a BatchSideInputHandlerFactory.SideInputGetter to access side inputs. - BatchSideInputHandlerFactory.SideInputGetter - Interface in org.apache.beam.runners.fnexecution.translation
-
Returns the value for the side input with the given PCollection id from the runner.
- BatchSinkContextImpl - Class in org.apache.beam.sdk.io.cdap.context
-
Class for creating the context object of different CDAP classes with the batch sink type.
- BatchSinkContextImpl() - Constructor for class org.apache.beam.sdk.io.cdap.context.BatchSinkContextImpl
- batchSize() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
- batchSize() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- BatchSourceContextImpl - Class in org.apache.beam.sdk.io.cdap.context
-
Class for creating the context object of different CDAP classes with the batch source type.
- BatchSourceContextImpl() - Constructor for class org.apache.beam.sdk.io.cdap.context.BatchSourceContextImpl
- BatchStatefulParDoOverrides - Class in org.apache.beam.runners.dataflow
-
PTransformOverrideFactories that expand to correctly implement stateful ParDo using the window-unaware BatchViewOverrides.GroupByKeyAndSortValuesOnly to linearize processing per key. - BatchStatefulParDoOverrides() - Constructor for class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides
- BatchStatefulParDoOverrides.BatchStatefulDoFn<K, V, OutputT> - Class in org.apache.beam.runners.dataflow
-
A key-preserving
DoFn that explodes an iterable that has been grouped by key and window. - BatchTransformTranslator<TransformT> - Interface in org.apache.beam.runners.twister2.translators
-
Batch TransformTranslator interface.
- batchWrite() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Write
- batchWrite(String, List<RecT>) - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
Asynchronously trigger a batch write request (unless already in error state).
- batchWrite(String, List<RecT>, boolean) - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
Asynchronously trigger a batch write request (unless already in error state).
- BatchWriteWithSummary(JodaClock, FirestoreStatefulComponentFactory, RpcQosOptions, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary
- Batch writing - Search tag in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
- Section
- BEAM_INSTANCE_PROPERTY - Static variable in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- BeamAggregateProjectMergeRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
This rule is essentially a wrapper around Calcite's
AggregateProjectMergeRule. - BeamAggregateProjectMergeRule(Class<? extends Aggregate>, Class<? extends Project>, RelBuilderFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregateProjectMergeRule
- BeamAggregationRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace an Aggregate node. - BeamAggregationRel(RelOptCluster, RelTraitSet, RelNode, ImmutableBitSet, List<ImmutableBitSet>, List<AggregateCall>, WindowFn<Row, IntervalWindow>, int) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
- BeamAggregationRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
Rule to detect the window/trigger settings.
- BeamAggregationRule(Class<? extends Aggregate>, Class<? extends Project>, RelBuilderFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregationRule
- BeamBasicAggregationRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
Aggregation rule that doesn't include projection.
- BeamBasicAggregationRule(Class<? extends Aggregate>, RelBuilderFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamBasicAggregationRule
- BeamBatchTSetEnvironment - Class in org.apache.beam.runners.twister2
-
This is a shell TSet environment which is used as a central driver model to fit what Beam expects.
- BeamBatchTSetEnvironment() - Constructor for class org.apache.beam.runners.twister2.BeamBatchTSetEnvironment
- BeamBatchWorker - Class in org.apache.beam.runners.twister2
-
The Twister2 worker that will execute the job logic once the job is submitted from the run method.
- BeamBatchWorker() - Constructor for class org.apache.beam.runners.twister2.BeamBatchWorker
- BeamBigQuerySqlDialect - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
- BeamBigQuerySqlDialect(SqlDialect.Context) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- BeamBuiltinAggregations - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Built-in aggregation functions for COUNT/MAX/MIN/SUM/AVG/VAR_POP/VAR_SAMP.
- BeamBuiltinAggregations() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations
- BeamBuiltinAggregations.BitXOr<T> - Class in org.apache.beam.sdk.extensions.sql.impl.transform
- BeamBuiltinAnalyticFunctions - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Built-in analytic functions for the aggregation analytics functionality.
- BeamBuiltinAnalyticFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
- BeamBuiltinAnalyticFunctions.PositionAwareCombineFn<InputT, AccumT, OutputT> - Class in org.apache.beam.sdk.extensions.sql.impl.transform - BeamBuiltinFunctionProvider - Class in org.apache.beam.sdk.extensions.sql.impl.udf
-
BeamBuiltinFunctionClass interface.
- BeamBuiltinFunctionProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.BeamBuiltinFunctionProvider
- BeamCalciteSchema - Class in org.apache.beam.sdk.extensions.sql.impl
- BeamCalciteTable - Class in org.apache.beam.sdk.extensions.sql.impl
-
Adapter from
BeamSqlTable to a Calcite Table. - BeamCalcMergeRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
Planner rule to merge a
BeamCalcRel with a BeamCalcRel. - BeamCalcMergeRule() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcMergeRule
- BeamCalcRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace
Project and Filter nodes. - BeamCalcRel(RelOptCluster, RelTraitSet, RelNode, RexProgram) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
- BeamCalcRel.WrappedList<T> - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
WrappedList translates
List on access. - BeamCalcRel.WrappedMap<V> - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
WrappedMap translates
Map on access. - BeamCalcRel.WrappedRow - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
WrappedRow translates
Row on access. - BeamCalcRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- BeamCalcSplittingRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
A
RelOptRule that converts a LogicalCalc into a chain of AbstractBeamCalcRel nodes via CalcRelSplitter. - BeamCalcSplittingRule(String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcSplittingRule
- BeamCoGBKJoinRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
A
BeamJoinRel that performs a CoGBK join. - BeamCoGBKJoinRel(RelOptCluster, RelTraitSet, RelNode, RelNode, RexNode, Set<CorrelationId>, JoinRelType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCoGBKJoinRel
- BeamCoGBKJoinRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
Rule to convert
a LogicalJoin node to a BeamCoGBKJoinRel node. - beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamPushDownIOSourceRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
-
This method is called by
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl. - beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
-
A dummy cost computation based on a fixed multiplier.
- BeamCostModel - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
VolcanoCost represents the cost of a plan node. - BeamCostModel.Factory - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
Implementation of
RelOptCostFactory that creates BeamCostModels. - BeamEnumerableConverter - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace an
Enumerable node. - BeamEnumerableConverter(RelOptCluster, RelTraitSet, RelNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
- BeamEnumerableConverterRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- beamFilesystemArtifactDestinationProvider(String) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
-
An ArtifactDestinationProvider that places new artifacts as files in a Beam filesystem.
- BeamFlinkDataSetAdapter - Class in org.apache.beam.runners.flink.adapter
-
An adapter class that allows one to apply Apache Beam PTransforms directly to Flink DataSets.
- BeamFlinkDataSetAdapter() - Constructor for class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- BeamFlinkDataSetAdapter(PipelineOptions) - Constructor for class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- BeamFlinkDataStreamAdapter - Class in org.apache.beam.runners.flink.adapter
-
An adapter class that allows one to apply Apache Beam PTransforms directly to Flink DataStreams.
- BeamFlinkDataStreamAdapter() - Constructor for class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- BeamFlinkDataStreamAdapter(PipelineOptions) - Constructor for class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- BeamFnDataGrpcMultiplexer - Class in org.apache.beam.sdk.fn.data
-
A gRPC multiplexer for a specific
Endpoints.ApiServiceDescriptor. - BeamFnDataGrpcMultiplexer(Endpoints.ApiServiceDescriptor, OutboundObserverFactory, OutboundObserverFactory.BasicFactory<BeamFnApi.Elements, BeamFnApi.Elements>) - Constructor for class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
- BeamFnDataInboundObserver - Class in org.apache.beam.sdk.fn.data
- BeamFnDataInboundObserver.CloseException - Exception Class in org.apache.beam.sdk.fn.data
- BeamFnDataOutboundAggregator - Class in org.apache.beam.sdk.fn.data
-
An outbound data-buffering aggregator with a size-based buffer and, if the corresponding options are set, a time-based buffer.
- BeamFnDataOutboundAggregator(PipelineOptions, Supplier<String>, StreamObserver<BeamFnApi.Elements>, boolean) - Constructor for class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
- BeamImpulseSource - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.impulse
-
A Beam
BoundedSource for the Impulse source. - BeamImpulseSource() - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.impulse.BeamImpulseSource
- BeamIntersectRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace an Intersect node. - BeamIntersectRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
- BeamIntersectRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Intersect with BeamIntersectRel. - BeamIOPushDownRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- BeamIOPushDownRule(RelBuilderFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOPushDownRule
- BeamIOSinkRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a
TableModify node. - BeamIOSinkRel(RelOptCluster, RelOptTable, Prepare.CatalogReader, RelNode, TableModify.Operation, List<String>, List<RexNode>, boolean, BeamSqlTable, Map<String, String>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- BeamIOSinkRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- BeamIOSourceRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a
TableScan node. - BeamIOSourceRel(RelOptCluster, RelTraitSet, RelOptTable, BeamSqlTable, Map<String, String>, BeamCalciteTable) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- BeamJavaTypeFactory - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
Customized data type in Beam.
- BeamJoinAssociateRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
This is very similar to
JoinAssociateRule. - BeamJoinPushThroughJoinRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
This is exactly analogous to
JoinPushThroughJoinRule. - BeamJoinRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
An abstract
BeamRelNode to implement Join Rels. - BeamJoinRel(RelOptCluster, RelTraitSet, RelNode, RelNode, RexNode, Set<CorrelationId>, JoinRelType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
- BeamJoinTransforms - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Collections of
PTransform and DoFn used to perform the JOIN operation. - BeamJoinTransforms() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms
- BeamJoinTransforms.JoinAsLookup - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Transform to execute Join as Lookup.
- BeamKafkaCSVTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.kafka
-
A Kafka topic that saves records in CSV format.
- BeamKafkaCSVTable(Schema, String, List<String>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
- BeamKafkaCSVTable(Schema, String, List<String>, TimestampPolicyFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
- BeamKafkaCSVTable(Schema, String, List<String>, CSVFormat, TimestampPolicyFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
- BeamKafkaTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.kafka
-
BeamKafkaTable represents a Kafka topic, as a source or target. - BeamKafkaTable(Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- BeamKafkaTable(Schema, String, List<String>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- BeamKafkaTable(Schema, String, List<String>, TimestampPolicyFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- BeamKafkaTable(Schema, List<TopicPartition>, String) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- BeamKafkaTable(Schema, List<TopicPartition>, String, TimestampPolicyFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- BeamLogicalConvention - Enum Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
Convention for Beam SQL.
- BeamMatchRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Match node. - BeamMatchRel(RelOptCluster, RelTraitSet, RelNode, RelDataType, RexNode, boolean, boolean, Map<String, RexNode>, Map<String, RexNode>, RexNode, Map<String, ? extends SortedSet<String>>, boolean, ImmutableBitSet, RelCollation, RexNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
- BeamMatchRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Match with BeamMatchRel. - BeamMinusRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Minus node. - BeamMinusRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
- BeamMinusRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Minus with BeamMinusRel. - BeamPCollectionTable<InputT> - Class in org.apache.beam.sdk.extensions.sql.impl.schema
-
BeamPCollectionTable converts a PCollection<Row> into a virtual table, which a downstream query can then query directly. - BeamPCollectionTable(PCollection<InputT>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
- BeamPushDownIOSourceRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
- BeamPushDownIOSourceRel(RelOptCluster, RelTraitSet, RelOptTable, BeamSqlTable, List<String>, BeamSqlTableFilter, Map<String, String>, BeamCalciteTable) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamPushDownIOSourceRel
- BeamRelDataTypeSystem - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
Customized data type in Beam.
- BeamRelMetadataQuery - Class in org.apache.beam.sdk.extensions.sql.impl.planner
- BeamRelNode - Interface in org.apache.beam.sdk.extensions.sql.impl.rel
-
A
RelNode that can also give a PTransform that implements the expression. - beamRow2CsvLine(Row, CSVFormat) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
- beamRowFromSourceRecordFn(Schema) - Static method in class org.apache.beam.io.debezium.KafkaConnectUtils
- Beam Rows - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- BeamRowToBigtableMutation - Class in org.apache.beam.sdk.io.gcp.bigtable
- BeamRowToBigtableMutation(Map<String, Set<String>>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation
- BeamRowToBigtableMutation.ToBigtableRowFn - Class in org.apache.beam.sdk.io.gcp.bigtable
- beamRowToIcebergRecord(Schema, Row) - Static method in class org.apache.beam.sdk.io.iceberg.IcebergUtils
-
Converts a Beam
Row to an Iceberg Record. - BeamRowToStorageApiProto - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Utility methods for converting Beam
Row objects to dynamic protocol messages, for use with the Storage Write API. - BeamRowToStorageApiProto() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BeamRowToStorageApiProto
- BeamRuleSets - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
RuleSet used in BeamQueryPlanner. - BeamRuleSets() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRuleSets
- beamSchemaFromJsonSchema(String) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
- beamSchemaFromKafkaConnectSchema(Schema) - Static method in class org.apache.beam.io.debezium.KafkaConnectUtils
- beamSchemaToIcebergSchema(Schema) - Static method in class org.apache.beam.sdk.io.iceberg.IcebergUtils
-
Converts a Beam
Schema to an Iceberg Schema. - beamSchemaTypeFromKafkaType(Schema) - Static method in class org.apache.beam.io.debezium.KafkaConnectUtils
- BeamSetOperatorRelBase - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
Delegate for Set operators:
BeamUnionRel, BeamIntersectRel and BeamMinusRel. - BeamSetOperatorRelBase(BeamRelNode, BeamSetOperatorRelBase.OpType, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase
- BeamSetOperatorRelBase.OpType - Enum Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
Set operator type.
- BeamSetOperatorsTransforms - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Collections of
PTransform and DoFn used to perform Set operations. - BeamSetOperatorsTransforms() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms
- BeamSetOperatorsTransforms.BeamSqlRow2KvFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Transform a
BeamSqlRow to a KV<BeamSqlRow, BeamSqlRow>. - BeamSetOperatorsTransforms.SetOperatorFilteringDoFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Filter function used for Set operators.
- BeamSideInputJoinRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
A
BeamJoinRelwhich does sideinput Join - BeamSideInputJoinRel(RelOptCluster, RelTraitSet, RelNode, RelNode, RexNode, Set<CorrelationId>, JoinRelType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputJoinRel
- BeamSideInputJoinRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
Rule to convert
LogicalJoin node to BeamSideInputJoinRel node. - BeamSideInputLookupJoinRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
A
BeamJoinRelwhich does Lookup Join - BeamSideInputLookupJoinRel(RelOptCluster, RelTraitSet, RelNode, RelNode, RexNode, Set<CorrelationId>, JoinRelType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputLookupJoinRel
- BeamSideInputLookupJoinRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
Rule to convert
LogicalJoin node to BeamSideInputLookupJoinRel node. - BeamSideInputLookupJoinRule() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputLookupJoinRule
- BeamSortRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Sort node. - BeamSortRel(RelOptCluster, RelTraitSet, RelNode, RelCollation, RexNode, RexNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- BeamSortRel.BeamSqlRowComparator - Class in org.apache.beam.sdk.extensions.sql.impl.rel
- BeamSortRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Sort with BeamSortRel. - beamSource - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- BeamSqlCli - Class in org.apache.beam.sdk.extensions.sql
-
BeamSqlCli provides methods to execute Beam SQL with an interactive client. - BeamSqlCli() - Constructor for class org.apache.beam.sdk.extensions.sql.BeamSqlCli
- BeamSqlDataCatalogExample - Class in org.apache.beam.sdk.extensions.sql.example
-
Example pipeline that uses Google Cloud Data Catalog to retrieve the table metadata.
- BeamSqlDataCatalogExample() - Constructor for class org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample
- BeamSqlDataCatalogExample.DCExamplePipelineOptions - Interface in org.apache.beam.sdk.extensions.sql.example
-
Pipeline options to specify the query and the output for the example.
- Beam SQL DSL usage: - Search tag in class org.apache.beam.sdk.extensions.sql.SqlTransform
- Section
- BeamSqlEnv - Class in org.apache.beam.sdk.extensions.sql.impl
-
Contains the metadata of tables and UDFs, and exposes APIs to query, validate, optimize, and translate SQL statements.
- BeamSqlEnv.BeamSqlEnvBuilder - Class in org.apache.beam.sdk.extensions.sql.impl
-
BeamSqlEnv's Builder.
- BeamSqlOutputToConsoleFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
A test PTransform that displays output in the console.
- BeamSqlOutputToConsoleFn(String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSqlOutputToConsoleFn
- BeamSqlParser - Class in org.apache.beam.sdk.extensions.sql.impl.parser
- BeamSqlPipelineOptions - Interface in org.apache.beam.sdk.extensions.sql.impl
-
Options used to configure BeamSQL.
- BeamSqlPipelineOptionsRegistrar - Class in org.apache.beam.sdk.extensions.sql.impl
-
AutoService registrar for BeamSqlPipelineOptions. - BeamSqlPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptionsRegistrar
- BeamSqlRelUtils - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
Utilities for
BeamRelNode. - BeamSqlRelUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- BeamSqlRow2KvFn() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.BeamSqlRow2KvFn
- BeamSqlRowComparator(List<Integer>, List<Boolean>, List<Boolean>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel.BeamSqlRowComparator
- BeamSqlSeekableTable - Interface in org.apache.beam.sdk.extensions.sql
-
A seekable table converts a JOIN operator to an inline lookup.
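The "inline lookup" idea behind BeamSqlSeekableTable can be sketched in plain Java (the class and method names below are illustrative, not Beam's actual API): instead of shuffling both join inputs into a co-group, each row on the non-seekable side seeks its matches from the lookup side directly.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;

public class LookupJoinSketch {
    // Stand-in for a seekable table: given the join-key value taken from one
    // input row, return the matching rows of the lookup side.
    public static List<String> seek(Map<String, List<String>> table, String key) {
        return table.getOrDefault(key, Collections.emptyList());
    }

    // Inline lookup join: each left row (key, payload) seeks its matches
    // directly rather than co-grouping both sides by key.
    public static List<String> lookupJoin(List<String[]> left, Map<String, List<String>> right) {
        List<String> out = new ArrayList<>();
        for (String[] row : left) {
            for (String match : seek(right, row[0])) {
                out.add(row[1] + ":" + match);
            }
        }
        return out;
    }
}
```

This is the shape of join that BeamSideInputLookupJoinRel (listed below) targets: the seekable side never materializes as a full input.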
- BeamSqlTable - Interface in org.apache.beam.sdk.extensions.sql.meta
-
This interface defines a Beam Sql Table.
- BeamSqlTableFilter - Interface in org.apache.beam.sdk.extensions.sql.meta
-
This interface defines Beam SQL Table Filter.
- BeamSqlUdf - Interface in org.apache.beam.sdk.extensions.sql
-
Interface to create a UDF in Beam SQL.
- BeamSqlUnparseContext - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
- BeamSqlUnparseContext(IntFunction<SqlNode>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamSqlUnparseContext
- BeamStoppableFunction - Interface in org.apache.beam.runners.flink.translation.wrappers.streaming.io
-
Custom StoppableFunction for backward compatibility.
- BeamTableFunctionScanRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace
TableFunctionScan. - BeamTableFunctionScanRel(RelOptCluster, RelTraitSet, List<RelNode>, RexNode, Type, RelDataType, Set<RelColumnMapping>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
- BeamTableFunctionScanRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
This is the converter rule that converts a Calcite
TableFunctionScan to BeamTableFunctionScanRel. - BeamTableStatistics - Class in org.apache.beam.sdk.extensions.sql.impl
-
This class stores row count statistics.
- BeamTableUtils - Class in org.apache.beam.sdk.extensions.sql.impl.schema
-
Utility methods for working with
BeamTable. - BeamTableUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
- BeamUncollectRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to implement an uncorrelated Uncollect, aka UNNEST. - BeamUncollectRel(RelOptCluster, RelTraitSet, RelNode, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
- BeamUncollectRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- BeamUnionRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Union. - BeamUnionRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
- BeamUnionRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- BeamUnnestRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
- BeamUnnestRel(RelOptCluster, RelTraitSet, RelNode, RelDataType, List<Integer>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
- BeamUnnestRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- BeamValuesRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Values node. - BeamValuesRel(RelOptCluster, RelDataType, ImmutableList<ImmutableList<RexLiteral>>, RelTraitSet) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
- BeamValuesRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Values with BeamValuesRel. - BeamWindowRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Window node. - BeamWindowRel(RelOptCluster, RelTraitSet, RelNode, List<RexLiteral>, RelDataType, List<Window.Group>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
- BeamWindowRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- BeamWorkerStatusGrpcService - Class in org.apache.beam.runners.fnexecution.status
-
A Fn Status service that collects run-time status information from SDK harnesses for debugging purposes.
- before_initial_sequence - Enum constant in enum class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent.Reason
- beforeEach(ExtensionContext) - Method in class org.apache.beam.sdk.testing.TestPipelineExtension
- beforeProcessing(PipelineOptions) - Method in interface org.apache.beam.sdk.harness.JvmInitializer
-
Implement beforeProcessing to run custom initialization after basic services such as logging are available, but before data processing begins.
- beforeProcessing(PipelineOptions) - Method in class org.apache.beam.sdk.io.kafka.KafkaIOInitializer
- beforeStart(ClientCallStreamObserver<RespT>) - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
- begin() - Method in class org.apache.beam.sdk.Pipeline
-
Returns a
PBegin owned by this Pipeline. - beginningOnDay(int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- beginningOnDay(int, int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- benchmarkHadoopLineReader(TextSourceBenchmark.Data) - Method in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark
- benchmarkTextSource(TextSourceBenchmark.Data) - Method in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark
- BIG_INT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- BIG_QUERY_INSERT_ERROR_ERROR_CONTAINER - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
- BigDecimalCoder - Class in org.apache.beam.sdk.coders
-
A
BigDecimalCoder encodes a BigDecimal as an integer scale encoded with VarIntCoder and a BigInteger encoded using BigIntegerCoder. - BigDecimalConverter - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
Provides converters from
BigDecimal to other numeric types based on the input Schema.TypeName. - BigDecimalConverter() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.BigDecimalConverter
- bigdecimals() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The
TypeDescriptor for BigDecimal. - BigEndianIntegerCoder - Class in org.apache.beam.sdk.coders
-
A
BigEndianIntegerCoder encodes Integers in 4 bytes, big-endian. - BigEndianLongCoder - Class in org.apache.beam.sdk.coders
-
A
BigEndianLongCoder encodes Longs in 8 bytes, big-endian. - BigEndianShortCoder - Class in org.apache.beam.sdk.coders
-
A
BigEndianShortCoder encodes Shorts in 2 bytes, big-endian. - BigIntegerCoder - Class in org.apache.beam.sdk.coders
-
A
BigIntegerCoder encodes a BigInteger as a byte array containing the big-endian two's-complement representation, encoded via ByteArrayCoder. - bigintegers() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The
TypeDescriptor for BigInteger. - BIGQUERY - Static variable in class org.apache.beam.sdk.managed.Managed
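The coder entries above describe simple byte-level layouts: 2/4/8-byte big-endian primitives, and a BigDecimal decomposed into an integer scale plus an unscaled BigInteger whose byte form is its big-endian two's-complement representation. A minimal plain-Java sketch of those layouts (not Beam's actual Coder API; the class and method names here are illustrative):

```java
import java.math.BigDecimal;
import java.math.BigInteger;
import java.nio.ByteBuffer;

public class CoderSketch {
    // 4-byte big-endian layout, as described for BigEndianIntegerCoder.
    // ByteBuffer uses big-endian byte order by default.
    public static byte[] encodeInt(int value) {
        return ByteBuffer.allocate(4).putInt(value).array();
    }

    public static int decodeInt(byte[] bytes) {
        return ByteBuffer.wrap(bytes).getInt();
    }

    // BigDecimalCoder's decomposition: an integer scale plus the unscaled
    // BigInteger, whose toByteArray() form is the big-endian two's-complement
    // representation described for BigIntegerCoder.
    public static BigDecimal recompose(int scale, byte[] twosComplement) {
        return new BigDecimal(new BigInteger(twosComplement), scale);
    }
}
```

Round-tripping a value through (scale(), unscaledValue().toByteArray()) and recompose reproduces it exactly; Beam's real coders add VarInt framing for the scale and byte-array length on top of these raw bytes.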
- BIGQUERY_EARLY_ROLLOUT_REGION - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
- BIGQUERY_JOB_TEMPLATE - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Template for BigQuery jobs created by BigQueryIO.
- BigqueryClient - Class in org.apache.beam.sdk.io.gcp.testing
-
A wrapper class for making BigQuery API calls.
- BigqueryClient(String) - Constructor for class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- BigQueryCoderProviderRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A
CoderProviderRegistrar for standard types used with BigQueryIO. - BigQueryCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
- BigQuery Concepts - Search tag in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
- Section
- BigQueryDirectReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- BigQueryDirectReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
An implementation of
TypedSchemaTransformProvider for BigQuery Storage Read API jobs configured via BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration. - BigQueryDirectReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
A
SchemaTransform for BigQuery Storage Read API, configured with BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration and instantiated by BigQueryDirectReadSchemaTransformProvider. - BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
Configuration for reading from BigQuery with Storage Read API.
- BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQueryDlqProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryDlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryDlqProvider
- BigQueryExportReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Configuration for reading from BigQuery.
- BigQueryExportReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
- BigQueryExportReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryExportReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
-
An implementation of
TypedSchemaTransformProvider for BigQuery read jobs configured using BigQueryExportReadSchemaTransformConfiguration. - BigQueryExportReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
- BigQueryExportReadSchemaTransformProvider.BigQueryExportSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery
-
An implementation of
SchemaTransform for BigQuery read jobs configured using BigQueryExportReadSchemaTransformConfiguration. - BigQueryFileLoadsSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
An implementation of
TypedSchemaTransformProvider for BigQuery write jobs configured using BigQueryWriteConfiguration. - BigQueryFileLoadsSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider
- BigQueryFileLoadsSchemaTransformProvider.BigQueryFileLoadsSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQueryFilter - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
- BigQueryFilter(List<RexNode>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
- BigQueryHelpers - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A set of helper functions and classes used by
BigQueryIO. - BigQueryHelpers() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
- BigQueryInsertError - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Model definition for BigQueryInsertError.
- BigQueryInsertError(TableRow, TableDataInsertAllResponse.InsertErrors, TableReference) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- BigQueryInsertErrorCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A
Coder that encodes BigQuery BigQueryInsertError objects. - BigQueryInsertErrorCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- BigQueryIO - Class in org.apache.beam.sdk.io.gcp.bigquery
-
PTransforms for reading and writing BigQuery tables. - BigQueryIO.Read - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Implementation of
BigQueryIO.read(). - BigQueryIO.TypedRead<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Implementation of
BigQueryIO.read(SerializableFunction). - BigQueryIO.TypedRead.Method - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
-
Determines the method used to read data from BigQuery.
- BigQueryIO.TypedRead.QueryPriority - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
-
An enumeration type for the priority of a query.
- BigQueryIO.Write<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Implementation of
BigQueryIO.write(). - BigQueryIO.Write.CreateDisposition - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
-
An enumeration type for the BigQuery create disposition strings.
- BigQueryIO.Write.Method - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
-
Determines the method used to insert data in BigQuery.
- BigQueryIO.Write.SchemaUpdateOption - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
-
An enumeration type for the BigQuery schema update options strings.
- BigQueryIO.Write.WriteDisposition - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
-
An enumeration type for the BigQuery write disposition strings.
- BigQueryIOTranslation - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryIOTranslation() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation
- BigQueryIOTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryIOTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigqueryMatcher - Class in org.apache.beam.sdk.io.gcp.testing
-
A matcher that verifies data in BigQuery by running a given query and comparing a checksum of the content.
- BigqueryMatcher.TableAndQuery - Class in org.apache.beam.sdk.io.gcp.testing
- BigQueryOptions - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
Properties needed when using Google BigQuery with the Apache Beam SDK.
- BigQuerySchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
-
An implementation of
SchemaIOProvider for reading and writing to BigQuery with BigQueryIO. - BigQuerySchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
- BigQuerySchemaRetrievalException - Exception Class in org.apache.beam.sdk.io.gcp.bigquery
-
Exception to signal that BigQuery schema retrieval failed.
- BigQuerySchemaTransformTranslation - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQuerySchemaTransformTranslation() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation
- BigQuerySchemaTransformTranslation.BigQueryStorageReadSchemaTransformTranslator - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQuerySchemaTransformTranslation.BigQueryWriteSchemaTransformTranslator - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQuerySchemaTransformTranslation.ReadWriteRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQueryServices - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface for real, mock, or fake implementations of Cloud BigQuery services.
- BigQueryServices.BigQueryServerStream<T> - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
Container for reading data from streaming endpoints.
- BigQueryServices.DatasetService - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface to get, create and delete Cloud BigQuery datasets and tables.
- BigQueryServices.DatasetService.TableMetadataView - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryServices.JobService - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface for the Cloud BigQuery load service.
- BigQueryServices.StorageClient - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface representing a client object for making calls to the BigQuery Storage API.
- BigQueryServices.StreamAppendClient - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface for appending records to a Storage API write stream.
- BigQueryServices.WriteStreamService - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface to get, create and flush Cloud BigQuery Storage API write streams.
- BigQueryServicesImpl - Class in org.apache.beam.sdk.io.gcp.bigquery
-
An implementation of
BigQueryServices that actually communicates with the cloud BigQuery service. - BigQueryServicesImpl() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
- BigQueryServicesImpl.DatasetServiceImpl - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryServicesImpl.WriteStreamServiceImpl - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQuerySinkMetrics - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Helper class to create per-worker metrics for BigQuery sink stages.
- BigQuerySinkMetrics() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
- BigQuerySinkMetrics.RpcMethod - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryStorageApiInsertError - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryStorageApiInsertError(TableRow) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
- BigQueryStorageApiInsertError(TableRow, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
- BigQueryStorageApiInsertErrorCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryStorageApiInsertErrorCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
- BigQueryStorageReadSchemaTransformTranslator() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.BigQueryStorageReadSchemaTransformTranslator
- BigQueryStorageTableSource<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A
Source representing reading from a table. - BigQueryStorageWriteApiSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
An implementation of
TypedSchemaTransformProvider for BigQuery Storage Write API jobs configured via BigQueryWriteConfiguration. - BigQueryStorageWriteApiSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
A
SchemaTransform for BigQuery Storage Write API, configured with BigQueryWriteConfiguration and instantiated by BigQueryStorageWriteApiSchemaTransformProvider. - BigQueryTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
-
BigQuery table provider.
- BigQueryTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
- BigQueryUtils - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Utility methods for BigQuery related operations.
- BigQueryUtils() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- BigQueryUtils.ConversionOptions - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Options for how to convert BigQuery data to Beam data.
- BigQueryUtils.ConversionOptions.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Builder for
BigQueryUtils.ConversionOptions. - BigQueryUtils.ConversionOptions.TruncateTimestamps - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
-
Controls whether to lossily truncate timestamps to millisecond precision, or to fail when truncation would occur.
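The lossy step that TruncateTimestamps controls can be illustrated with plain arithmetic, under the assumption that the source value carries microsecond precision and the target keeps only milliseconds (the helper names below are illustrative, not Beam's API):

```java
public class TimestampTruncationSketch {
    // Drop the sub-millisecond remainder of a microsecond timestamp.
    // floorDiv keeps negative (pre-epoch) values rounding toward the past.
    public static long truncateMicrosToMillis(long micros) {
        return Math.floorDiv(micros, 1000L);
    }

    // True when truncation would actually lose information, i.e. the case
    // where the strict (non-truncating) mode would reject the value.
    public static boolean isLossy(long micros) {
        return Math.floorMod(micros, 1000L) != 0;
    }
}
```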
- BigQueryUtils.SchemaConversionOptions - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Options for how to convert BigQuery schemas to Beam schemas.
- BigQueryUtils.SchemaConversionOptions.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Builder for
BigQueryUtils.SchemaConversionOptions. - BigQueryWriteConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
Configuration for writing to BigQuery with SchemaTransforms.
- BigQueryWriteConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- BigQueryWriteConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
Builder for
BigQueryWriteConfiguration. - BigQueryWriteConfiguration.ErrorHandling - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQueryWriteConfiguration.ErrorHandling.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQueryWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
A BigQuery Write SchemaTransformProvider that routes to either
BigQueryFileLoadsSchemaTransformProvider or BigQueryStorageWriteApiSchemaTransformProvider. - BigQueryWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteSchemaTransformProvider
- BigQueryWriteSchemaTransformProvider.BigQueryWriteSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQueryWriteSchemaTransformTranslator() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.BigQueryWriteSchemaTransformTranslator
- BigtableChangeStreamAccessor - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
-
A likely temporary solution that is part of a larger migration from cloud-bigtable-client-core to java-bigtable.
- BigtableChangeStreamTestOptions - Interface in org.apache.beam.sdk.io.gcp.bigtable.changestreams
- BigtableClientOverride - Interface in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
-
Override the configuration of Cloud Bigtable data and admin client.
- BigtableConfig - Class in org.apache.beam.sdk.io.gcp.bigtable
-
Configuration for a Cloud Bigtable client.
- BigtableConfig() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
- BigtableIO - Class in org.apache.beam.sdk.io.gcp.bigtable
-
Transforms for reading from and writing to Google Cloud Bigtable. - BigtableIO.ExistingPipelineOptions - Enum Class in org.apache.beam.sdk.io.gcp.bigtable
-
Options that determine what to do if a change stream name is being reused and metadata for that change stream name already exists.
- BigtableIO.Read - Class in org.apache.beam.sdk.io.gcp.bigtable
-
A
PTransform that reads from Google Cloud Bigtable. - BigtableIO.ReadChangeStream - Class in org.apache.beam.sdk.io.gcp.bigtable
- BigtableIO.Write - Class in org.apache.beam.sdk.io.gcp.bigtable
-
A
PTransform that writes to Google Cloud Bigtable. - BigtableIO.WriteWithResults - Class in org.apache.beam.sdk.io.gcp.bigtable
-
A
PTransform that writes to Google Cloud Bigtable and emits a BigtableWriteResult for each batch written. - BigtableReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
- BigtableReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigtable
-
An implementation of
TypedSchemaTransformProvider for Bigtable Read jobs configured via BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration. - BigtableReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
- BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigtable
-
Configuration for reading from Bigtable.
- BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigtable
- BigtableRowToBeamRow - Class in org.apache.beam.sdk.io.gcp.bigtable
- BigtableRowToBeamRow(Schema) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRow
- BigtableRowToBeamRowFlat - Class in org.apache.beam.sdk.io.gcp.bigtable
- BigtableRowToBeamRowFlat(Schema, Map<String, Set<String>>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRowFlat
- BigtableTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigtable
- BigtableTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigtable
- BigtableTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTableProvider
- BigtableUtils - Class in org.apache.beam.sdk.io.gcp.testing
- BigtableUtils() - Constructor for class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
- BigtableWriteResult - Class in org.apache.beam.sdk.io.gcp.bigtable
-
The result of writing a batch of rows to Bigtable.
- BigtableWriteResult() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
- BigtableWriteResultCoder - Class in org.apache.beam.sdk.io.gcp.bigtable
-
A coder for
BigtableWriteResult. - BigtableWriteResultCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
- BigtableWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
- BigtableWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigtable
-
An implementation of
TypedSchemaTransformProvider for Bigtable Write jobs configured via BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration. - BigtableWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
- BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigtable
-
Configuration for writing to Bigtable.
- BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigtable
- BigtableWriteSchemaTransformProvider.GetMutationsFromBeamRow - Class in org.apache.beam.sdk.io.gcp.bigtable
- BinaryCombineDoubleFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- BinaryCombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
- BinaryCombineIntegerFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- BinaryCombineLongFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- bind(String, StateBinder) - Method in interface org.apache.beam.sdk.state.StateSpec
-
For internal use only; no backwards-compatibility guarantees.
- bindBag(String, StateSpec<BagState<T>>, Coder<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindBag(String, StateSpec<BagState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
- bindCombining(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindCombining(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Method in interface org.apache.beam.sdk.state.StateBinder
- bindCombiningWithContext(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindCombiningWithContext(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Method in interface org.apache.beam.sdk.state.StateBinder
- Binding AMQP and receive messages - Search tag in class org.apache.beam.sdk.io.amqp.AmqpIO
- Section
- bindMap(String, StateSpec<MapState<KeyT, ValueT>>, Coder<KeyT>, Coder<ValueT>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindMap(String, StateSpec<MapState<KeyT, ValueT>>, Coder<KeyT>, Coder<ValueT>) - Method in interface org.apache.beam.sdk.state.StateBinder
- bindMultimap(String, StateSpec<MultimapState<KeyT, ValueT>>, Coder<KeyT>, Coder<ValueT>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindMultimap(String, StateSpec<MultimapState<KeyT, ValueT>>, Coder<KeyT>, Coder<ValueT>) - Method in interface org.apache.beam.sdk.state.StateBinder
- bindOrderedList(String, StateSpec<OrderedListState<T>>, Coder<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindOrderedList(String, StateSpec<OrderedListState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
- bindSet(String, StateSpec<SetState<T>>, Coder<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindSet(String, StateSpec<SetState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
- bindValue(String, StateSpec<ValueState<T>>, Coder<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindValue(String, StateSpec<ValueState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
- bindWatermark(String, StateSpec<WatermarkHoldState>, TimestampCombiner) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindWatermark(String, StateSpec<WatermarkHoldState>, TimestampCombiner) - Method in interface org.apache.beam.sdk.state.StateBinder
-
Bind to a watermark StateSpec.
- BitSetCoder - Class in org.apache.beam.sdk.coders
-
Coder for BitSet.
- BitXOr() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- BlackholeOutput() - Constructor for class org.apache.beam.sdk.jmh.util.VarIntBenchmark.BlackholeOutput
- BlobstoreClientBuilderFactory - Interface in org.apache.beam.sdk.io.azure.options
-
Construct BlobServiceClientBuilder from Azure pipeline options.
- BlobstoreOptions - Interface in org.apache.beam.sdk.io.azure.options
-
Options used to configure Microsoft Azure Blob Storage.
- Block() - Constructor for class org.apache.beam.sdk.io.BlockBasedSource.Block
- BlockBasedReader(BlockBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
- BlockBasedSource<T> - Class in org.apache.beam.sdk.io
-
A BlockBasedSource is a FileBasedSource where a file consists of blocks of records.
- BlockBasedSource(String, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
-
Like BlockBasedSource(String, EmptyMatchTreatment, long) but with a default EmptyMatchTreatment of EmptyMatchTreatment.DISALLOW.
- BlockBasedSource(String, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
-
Creates a BlockBasedSource based on a file name or pattern.
- BlockBasedSource(MatchResult.Metadata, long, long, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
-
Creates a BlockBasedSource for a single file.
- BlockBasedSource(ValueProvider<String>, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
- BlockBasedSource(ValueProvider<String>, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
- BlockBasedSource.Block<T> - Class in org.apache.beam.sdk.io
-
A Block represents a block of records that can be read.
- BlockBasedSource.BlockBasedReader<T> - Class in org.apache.beam.sdk.io
-
A Reader that reads records from a BlockBasedSource.
- BlockingCommitterImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- BlockTracker(OffsetRange, long, long) - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles.BlockTracker
- BOOL - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- BOOL - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- BOOLEAN - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- BOOLEAN - Enum constant in enum class org.apache.beam.sdk.transforms.display.DisplayData.Type
- BOOLEAN - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- BOOLEAN - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of boolean fields.
- BooleanCoder - Class in org.apache.beam.sdk.coders
- BooleanCoder() - Constructor for class org.apache.beam.sdk.coders.BooleanCoder
- booleans() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The TypeDescriptor for Boolean.
- booleanToByteArray(boolean) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
- borrowDataset(PTransform<? extends PValue, ?>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- borrowDataset(PValue) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- BOTH - Enum constant in enum class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.PushDownOptions
- bounded(String, BoundedSource<T>, SerializablePipelineOptions, int) - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- Bounded(SparkContext, BoundedSource<T>, SerializablePipelineOptions, String) - Constructor for class org.apache.beam.runners.spark.io.SourceRDD.Bounded
- BOUNDED - Enum constant in enum class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.IsBounded
-
Indicates that a Restriction represents a bounded amount of work.
- BOUNDED - Enum constant in enum class org.apache.beam.sdk.values.PCollection.IsBounded
-
Indicates that a PCollection contains a bounded number of elements.
- BOUNDED_UNKNOWN - Static variable in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- BoundedDataset<T> - Class in org.apache.beam.runners.spark.translation
-
Holds an RDD or values for deferred conversion to an RDD if needed.
- BoundedDatasetFactory - Class in org.apache.beam.runners.spark.structuredstreaming.io
- boundedImpulse() - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- boundedness - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- Boundedness - Search tag in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
- Section
- BoundedReader() - Constructor for class org.apache.beam.sdk.io.BoundedSource.BoundedReader
- BoundedReadFromUnboundedSource<T> - Class in org.apache.beam.sdk.io
-
PTransform that reads a bounded amount of data from an UnboundedSource, specified as one or both of a maximum number of elements or a maximum period of time to read.
- BoundedSource<T> - Class in org.apache.beam.sdk.io
-
A Source that reads a finite amount of input and, because of that, supports some additional operations.
- BoundedSource() - Constructor for class org.apache.beam.sdk.io.BoundedSource
- BoundedSource.BoundedReader<T> - Class in org.apache.beam.sdk.io
-
A Reader that reads a bounded amount of input and supports some additional operations, such as progress estimation and dynamic work rebalancing.
- BoundedSourceP<T> - Class in org.apache.beam.runners.jet.processors
-
Jet Processor implementation for reading from a bounded Beam source.
- boundedTrie(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that accumulates and reports a set of unique string values bounded to a max limit.
- boundedTrie(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that accumulates and reports a set of unique string values bounded to a max limit.
- BoundedTrie - Interface in org.apache.beam.sdk.metrics
-
Internal: For internal use only and not for public consumption.
- BoundedTrieImpl - Class in org.apache.beam.runners.jet.metrics
-
Implementation of BoundedTrie.
- BoundedTrieImpl(MetricName) - Constructor for class org.apache.beam.runners.jet.metrics.BoundedTrieImpl
- BoundedTrieResult - Class in org.apache.beam.sdk.metrics
-
Internal: For internal use only and not for public consumption.
- BoundedTrieResult() - Constructor for class org.apache.beam.sdk.metrics.BoundedTrieResult
- BoundedWindow - Class in org.apache.beam.sdk.transforms.windowing
-
A BoundedWindow represents window information assigned to data elements.
- BoundedWindow() - Constructor for class org.apache.beam.sdk.transforms.windowing.BoundedWindow
- boxIfPrimitive(TypeDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- bqServices - Variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- broadcast(JavaSparkContext) - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
- broadcast(T) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- BrokerResponse - Class in org.apache.beam.sdk.io.solace.broker
- BrokerResponse(int, String, InputStream) - Constructor for class org.apache.beam.sdk.io.solace.broker.BrokerResponse
- bucketAccessible(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns whether the GCS bucket exists and is accessible.
- bucketOwner(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns the project number of the project which owns this bucket.
- buffer(BufferedElement) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.KeyedBufferingElementsHandler
- buffer(BufferedElement) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.NonKeyedBufferingElementsHandler
- buffered - Enum constant in enum class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent.Reason
- BufferedElement - Interface in org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput
-
An interface for elements buffered during a checkpoint when using @RequiresStableInput.
- BufferedExternalSorter - Class in org.apache.beam.sdk.extensions.sorter
-
Sorter that will use in memory sorting until the values can't fit into memory and will then fall back to external sorting.
- BufferedExternalSorter.Options - Class in org.apache.beam.sdk.extensions.sorter
-
Contains configuration for the sorter.
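The "sort in memory until the values no longer fit, then fall back to external sorting" behavior described for BufferedExternalSorter can be sketched in plain Java. This is a hypothetical stand-in (class name SpillingSorter, in-memory lists standing in for on-disk runs), not Beam's implementation:

```java
import java.util.*;

// Sketch: buffer elements in memory; once a limit is hit, "spill" a sorted
// run (here kept as a list; a real sorter would write it to disk), then
// k-way merge all runs at the end.
class SpillingSorter {
    private final int memoryLimit;                                // max elements held in memory
    private final List<Integer> current = new ArrayList<>();
    private final List<List<Integer>> runs = new ArrayList<>();   // stands in for on-disk runs

    SpillingSorter(int memoryLimit) { this.memoryLimit = memoryLimit; }

    void add(int value) {
        current.add(value);
        if (current.size() >= memoryLimit) spill();
    }

    private void spill() {
        Collections.sort(current);
        runs.add(new ArrayList<>(current));
        current.clear();
    }

    List<Integer> sort() {
        if (!current.isEmpty()) spill();
        // k-way merge via a min-heap of (value, runIndex, offsetInRun)
        PriorityQueue<int[]> heap = new PriorityQueue<>(Comparator.comparingInt(a -> a[0]));
        for (int r = 0; r < runs.size(); r++) {
            if (!runs.get(r).isEmpty()) heap.add(new int[] {runs.get(r).get(0), r, 0});
        }
        List<Integer> out = new ArrayList<>();
        while (!heap.isEmpty()) {
            int[] top = heap.poll();
            out.add(top[0]);
            int next = top[2] + 1;
            if (next < runs.get(top[1]).size()) {
                heap.add(new int[] {runs.get(top[1]).get(next), top[1], next});
            }
        }
        return out;
    }
}
```

The merge step is why spilled runs must each be sorted: the heap only ever needs to compare the current head of each run.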
- bufferingDoFnRunner - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- BufferingDoFnRunner<InputT, OutputT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput
A DoFnRunner which buffers data for supporting DoFn.RequiresStableInput.
- BufferingStreamObserver<T> - Class in org.apache.beam.sdk.fn.stream
-
A thread safe StreamObserver which uses a bounded queue to pass elements to a processing thread responsible for interacting with the underlying CallStreamObserver.
- BufferingStreamObserver(Phaser, CallStreamObserver<T>, ExecutorService, int) - Constructor for class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
- build() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- build() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
- build() - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.Builder
- build() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions.Builder
- build() - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText.Builder
- build() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
- build() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
- build() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
- build() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
- build() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions.Builder
- build() - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Builder
- build() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Build function to create an instance of BeamSqlEnv based on preset fields.
- build() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
- build() - Method in class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexNode.Builder
- build() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
- build() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally.Builder
- build() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey.Builder
- build() - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
- build() - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider.Builder
-
Validates and fully initializes a StsAssumeRoleForFederatedCredentialsProvider instance.
- build() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation.Builder
- build() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey.Builder
- build() - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.cdap.ConfigWrapper
- build() - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
- build() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration.Builder
-
Builds a CsvWriteTransformProvider.CsvWriteConfiguration instance.
- build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.CsvConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
-
Builds the BigQueryExportReadSchemaTransformConfiguration configuration.
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
-
Builds a BigQueryWriteConfiguration instance.
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
-
Builds a BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration instance.
- build() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Create a new instance of RpcQosOptions from the current builder state.
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Builds the ChangeStreamRecordMetadata.
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Builds a PartitionMetadata from the given fields.
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.Builder
- build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination.Builder
- build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig.Builder
- build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
- build() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration.Builder
-
Builds a JsonWriteTransformProvider.JsonWriteConfiguration instance.
- build() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
-
Builds a KafkaReadSchemaTransformConfiguration instance.
- build() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
-
Builds the SingleStoreSchemaTransformReadConfiguration configuration.
- build() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration.Builder
-
Builds the SingleStoreSchemaTransformWriteConfiguration configuration.
- build() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory.Builder
- build() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory.Builder
- build() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
- build() - Method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey.Builder
- build() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Destination.Builder
- build() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult.Builder
- build() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- build() - Method in class org.apache.beam.sdk.io.sparkreceiver.ReceiverBuilder
- build() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
- build() - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration.Builder
-
Builds the TFRecordReadSchemaTransformConfiguration configuration.
- build() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration.Builder
-
Builds the TFRecordWriteSchemaTransformConfiguration configuration.
- build() - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config.Builder
- build() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
- build() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate.Builder
- build() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Builder
- build() - Method in class org.apache.beam.sdk.schemas.io.Failure.Builder
- build() - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- build() - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
- build() - Method in class org.apache.beam.sdk.schemas.Schema.Options.Builder
- build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling.Builder
- build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Builder
- build() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
- build() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
- build() - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError.Builder
- build() - Method in class org.apache.beam.sdk.values.Row.Builder
- build() - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
- build() - Method in class org.apache.beam.sdk.values.WindowedValues.Builder
- build(String) - Method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.MetricNameBuilder
- build(BeamSqlEnv.BeamSqlEnvBuilder, boolean, PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
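The long run of build() entries above all terminate the same builder pattern: a nested Builder collects fields via chained setters, and build() validates required fields and produces the immutable value. A minimal self-contained sketch (Config is a hypothetical example class, not a Beam type):

```java
// Sketch of the builder pattern behind the build()/builder() entries.
final class Config {
    private final String table;
    private final int batchSize;

    private Config(Builder b) {
        this.table = b.table;
        this.batchSize = b.batchSize;
    }

    static Builder builder() { return new Builder(); }

    String getTable() { return table; }
    int getBatchSize() { return batchSize; }

    static final class Builder {
        private String table;
        private int batchSize = 100; // default used when the setter is never called

        Builder setTable(String table) { this.table = table; return this; }
        Builder setBatchSize(int batchSize) { this.batchSize = batchSize; return this; }

        Config build() {
            // build() is where required fields are validated, so a half-configured
            // Builder can exist but an invalid Config cannot.
            if (table == null) throw new IllegalStateException("table is required");
            return new Config(this);
        }
    }
}
```

Usage reads as one chained expression: `Config.builder().setTable("events").setBatchSize(50).build()`.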
- buildBeamSqlNullableSchema(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
- buildBeamSqlSchema(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
-
Create a RowsBuilder with the specified row type info.
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTableProvider
-
Instantiates the DataGeneratorTable when a CREATE EXTERNAL TABLE statement with TYPE 'datagen' is executed.
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergMetastore
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.KafkaTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.parquet.ParquetTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite.PubsubLiteTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.seqgen.GenerateSequenceTableProvider
- buildBeamSqlTable(Table) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Build a BeamSqlTable using the given table meta info.
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
- buildClient(AwsOptions, BuilderT, ClientConfiguration) - Static method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
Utility to directly build a client of type ClientBuilderFactory using a builder of ClientBuilderFactory.
- buildDatasource() - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
- buildDatasource() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Builds SnowflakeBasicDataSource based on the current configuration.
- builder() - Static method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.io.requestresponse.Monitoring
- builder() - Static method in class org.apache.beam.runners.jobsubmission.JobPreparation
- builder() - Static method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration
- builder() - Static method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
-
Returns a GcsCreateOptions.Builder.
- builder() - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
- builder() - Static method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
- builder() - Static method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
- builder() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- builder() - Static method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase
-
Creates a ParameterListBuilder.
- builder() - Static method in class org.apache.beam.sdk.extensions.sql.meta.Table
- builder() - Static method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider
-
Creates a builder for the type StsAssumeRoleForFederatedCredentialsProvider.
- builder() - Static method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation
- builder() - Static method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
- builder() - Static method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Creates a new uninitialized S3FileSystemConfiguration.Builder.
- builder() - Static method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
- builder() - Static method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.cdap.Plugin
-
Creates a plugin builder instance.
- builder() - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- builder() - Static method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions
-
Returns a CreateOptions.StandardCreateOptions.Builder.
- builder() - Static method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
-
Instantiates a BigQueryExportReadSchemaTransformConfiguration.Builder.
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
-
Instantiates a BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder instance.
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
-
Instantiates a BigQueryWriteConfiguration.Builder instance.
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
-
Instantiates a BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder instance.
- builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling
- builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling
- builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
- builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- builder() - Static method in class org.apache.beam.sdk.io.iceberg.IcebergDestination
- builder() - Static method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- builder() - Static method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
- builder() - Static method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- builder() - Static method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- builder() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
-
Instantiates a KafkaReadSchemaTransformConfiguration.Builder instance.
- builder() - Static method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
-
Instantiates a SingleStoreSchemaTransformReadConfiguration.Builder.
- builder() - Static method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
-
Instantiates a SingleStoreSchemaTransformWriteConfiguration.Builder.
- builder() - Static method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
- builder() - Static method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory
- builder() - Static method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
- builder() - Static method in class org.apache.beam.sdk.io.solace.data.Semp.Queue
- builder() - Static method in class org.apache.beam.sdk.io.solace.data.Semp.QueueData
- builder() - Static method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey
- builder() - Static method in class org.apache.beam.sdk.io.solace.data.Solace.Destination
- builder() - Static method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
- builder() - Static method in class org.apache.beam.sdk.io.solace.data.Solace.Record
- builder() - Static method in class org.apache.beam.sdk.io.solace.RetryCallableManager
- builder() - Static method in class org.apache.beam.sdk.io.TextRowCountEstimator
- builder() - Static method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration
-
Instantiates a TFRecordReadSchemaTransformConfiguration.Builder instance.
- builder() - Static method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
-
Instantiates a TFRecordWriteSchemaTransformConfiguration.Builder instance.
- builder() - Static method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config
- builder() - Static method in class org.apache.beam.sdk.metrics.MetricsFilter
- builder() - Static method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
- builder() - Static method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate
- builder() - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
- builder() - Static method in class org.apache.beam.sdk.schemas.Schema
- builder() - Static method in class org.apache.beam.sdk.schemas.Schema.Options
- builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider.Configuration
- builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration
- builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
- builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
- builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
- builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
- builder() - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
- builder() - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
- builder() - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record
- builder() - Static method in class org.apache.beam.sdk.values.WindowedValues
- builder(Dialect) - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
- builder(CatalogManager) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
-
Creates a builder with the default schema backed by the catalog manager.
- builder(TableProvider) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
-
Creates a builder with the default schema backed by the table provider.
- builder(WindowedValue<T>) - Static method in class org.apache.beam.sdk.values.WindowedValues
-
Creates a Builder that takes element metadata from the provided delegate.
- builder(RexNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexNode
- builder(T) - Method in class org.apache.beam.sdk.testing.TestOutputReceiver
- builder(T) - Method in interface org.apache.beam.sdk.transforms.DoFn.OutputReceiver
- Builder() - Constructor for class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.io.requestresponse.Monitoring.Builder
- Builder() - Constructor for class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.ml.AnnotateText.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.cdap.Plugin.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.CsvConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergDestination.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.Destination.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.PublishResult.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.RetryCallableManager.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.splunk.SplunkWriteError.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config.Builder
- Builder() - Constructor for class org.apache.beam.sdk.metrics.MetricsFilter.Builder
- Builder() - Constructor for class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.io.Failure.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.Schema.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.Schema.Field.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.transforms.DoFnSchemaInformation.Builder
- Builder() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Builder
- Builder() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
- Builder() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
- Builder() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.Builder
- Builder() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError.Builder
- Builder() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.ParseResult.Builder
- Builder() - Constructor for class org.apache.beam.sdk.values.WindowedValues.Builder
- Builder(JodaClock, FirestoreStatefulComponentFactory, RpcQosOptions, boolean, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
- Builder(JodaClock, FirestoreStatefulComponentFactory, RpcQosOptions, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments.Builder
- Builder(RexNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexNode.Builder
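The many builder() factories and nested Builder() constructors indexed above follow the same AutoValue-style builder pattern: a static factory returning a mutable Builder whose build() validates required properties and produces an immutable value. A minimal self-contained sketch of that pattern (SampleConfig is a hypothetical class for illustration, not an actual Beam type):

```java
// Hypothetical illustration of the AutoValue-style builder pattern used by the
// configuration classes indexed above; SampleConfig is not an actual Beam class.
class SampleConfig {
  private final String table;
  private final int batchSize;

  private SampleConfig(String table, int batchSize) {
    this.table = table;
    this.batchSize = batchSize;
  }

  public String getTable() { return table; }
  public int getBatchSize() { return batchSize; }

  // Static factory, mirroring the builder() methods listed in this index.
  public static Builder builder() { return new Builder(); }

  public static class Builder {
    private String table;
    private int batchSize = 100; // default for an optional property

    public Builder setTable(String table) { this.table = table; return this; }
    public Builder setBatchSize(int batchSize) { this.batchSize = batchSize; return this; }

    // build() checks required properties, as AutoValue-generated builders do.
    public SampleConfig build() {
      if (table == null) {
        throw new IllegalStateException("Missing required property: table");
      }
      return new SampleConfig(table, batchSize);
    }
  }
}
```

Configuration objects are then created fluently, e.g. SampleConfig.builder().setTable("t").build().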
- builderForType(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- builderFrom(S3Options) - Static method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Creates a new S3FileSystemConfiguration.Builder with values initialized by the properties of s3Options.
- buildExternal(ConfigT) - Method in interface org.apache.beam.sdk.transforms.ExternalTransformBuilder
-
Builds the transform after it has been configured.
- buildExternal(DebeziumTransformRegistrar.ReadBuilder.Configuration) - Method in class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder
- buildExternal(KinesisTransformRegistrar.ReadDataBuilder.Configuration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder
- buildExternal(KinesisTransformRegistrar.WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder
- buildExternal(ExternalRead.Configuration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.ReadBuilder
- buildExternal(ExternalWrite.Configuration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.WriteBuilder
- buildExternal(SpannerTransformRegistrar.ChangeStreamReaderBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ChangeStreamReaderBuilder
- buildExternal(SpannerTransformRegistrar.ReadBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder
- buildExternal(SpannerTransformRegistrar.WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.DeleteBuilder
- buildExternal(SpannerTransformRegistrar.WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.InsertBuilder
- buildExternal(SpannerTransformRegistrar.WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.InsertOrUpdateBuilder
- buildExternal(SpannerTransformRegistrar.WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReplaceBuilder
- buildExternal(SpannerTransformRegistrar.WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.UpdateBuilder
- buildExternal(ReadBuilder.Configuration) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.ReadBuilder
- buildExternal(WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder
- buildFrom(DescriptorProtos.FileDescriptorSet) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- buildFrom(Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- buildFrom(Descriptors.FileDescriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- buildFrom(InputStream) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- Building a Managed turnkey transform - Search tag in class org.apache.beam.sdk.managed.Managed
- Section
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
- buildIOReader(PBegin) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Creates a PCollection<Row> from the source.
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTable
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
- buildIOReader(PBegin, BeamSqlTableFilter, List<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
- buildIOReader(PBegin, BeamSqlTableFilter, List<String>) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Creates a PCollection<Row> from the source with predicate and/or projection pushed down.
- buildIOReader(PBegin, BeamSqlTableFilter, List<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
- buildIOReader(PBegin, BeamSqlTableFilter, List<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTable
- buildIOReader(PBegin, BeamSqlTableFilter, List<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
- buildIOWriter(PCollection<Row>) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Creates an IO.write() instance to write to the target.
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCoGBKJoinRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamPushDownIOSourceRel
- buildPTransform() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputJoinRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputLookupJoinRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
- buildPTransform(PTransform<PCollection<Row>, ? extends POutput>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
- buildPTransform(PTransform<PCollection<Row>, ? extends POutput>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
- buildReader() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- buildReader() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIO
-
Returns a schema-aware reader.
- buildRows(Schema, List<?>) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
-
Convenient way to build BeamSqlRows.
- buildSchemaWithAttributes(Schema, List<String>, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
-
Builds a new Schema by adding additional optional attributes and a map field to the provided schema.
- buildTemporaryFilename(ResourceId, String) - Static method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
Constructs a temporary file resource given the temporary directory and a filename.
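As a rough sketch of what a temporary-filename helper of this kind does, the snippet below resolves a temp path from a directory and a filename. The ".temp-beam-" prefix and string-based paths are assumptions for illustration only; Beam's actual implementation operates on ResourceId and its naming scheme may differ:

```java
// Illustrative only: derives a temporary file path from a temp directory and a
// filename. The ".temp-beam-" prefix is an assumed naming scheme, not
// necessarily the one used by FileBasedSink.WriteOperation.
class TempNames {
  public static String buildTemporaryFilename(String tempDirectory, String filename) {
    String dir = tempDirectory.endsWith("/") ? tempDirectory : tempDirectory + "/";
    return dir + ".temp-beam-" + filename;
  }
}
```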
- buildTransform(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.AvroReadSchemaTransformFormatProvider
- buildTransform(FileReadSchemaTransformConfiguration) - Method in interface org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformFormatProvider
- buildTransform(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.JsonReadSchemaTransformFormatProvider
- buildTransform(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.LineReadSchemaTransformFormatProvider
- buildTransform(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.ParquetReadSchemaTransformFormatProvider
- buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.AvroWriteSchemaTransformFormatProvider
-
Builds a PTransform that transforms a Row PCollection into a result PCollectionTuple with two tags, one for file names written using AvroIO.Write, another for errored-out rows.
- buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.CsvWriteSchemaTransformFormatProvider
- buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in interface org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformFormatProvider
-
Builds a PTransform that writes a Row PCollection and outputs the resulting PCollectionTuple with two tags, one for the file names, and another for errored-out rows.
- buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.JsonWriteSchemaTransformFormatProvider
-
Builds a PTransform that transforms a Row PCollection into a result PCollectionTuple with two tags, one for file names written using TextIO.Write, another for errored-out rows.
- buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.ParquetWriteSchemaTransformFormatProvider
-
Builds a
PTransform that transforms a Row PCollection into a result PCollectionTuple with two tags, one for file names written using ParquetIO.Sink and FileIO.Write, another for errored-out rows. - buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.XmlWriteSchemaTransformFormatProvider
-
Builds a
PTransform that transforms a Row PCollection into a result PCollectionTuple with two tags, one for file names written using XmlIO.Sink and FileIO.Write, another for errored-out rows. - buildTwoInputStream(KeyedStream<WindowedValue<KV<K, InputT>>, FlinkKey>, DataStream<RawUnionValue>, String, WindowDoFnOperator<K, InputT, OutputT>, TypeInformation<WindowedValue<KV<K, OutputT>>>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
- buildWriter() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- buildWriter() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIO
-
Returns a schema-aware writer.
- BUILTIN_AGGREGATOR_FACTORIES - Static variable in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations
- BUILTIN_ANALYTIC_FACTORIES - Static variable in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
- BuiltinHashFunctions - Class in org.apache.beam.sdk.extensions.sql.impl.udf
-
Hash Functions.
- BuiltinHashFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
- BuiltinStringFunctions - Class in org.apache.beam.sdk.extensions.sql.impl.udf
-
BuiltinStringFunctions.
- BuiltinStringFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- BuiltinTrigonometricFunctions - Class in org.apache.beam.sdk.extensions.sql.impl.udf
-
TrigonometricFunctions.
- BuiltinTrigonometricFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinTrigonometricFunctions
- bulkIO() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
- BulkIO() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
- Bulk reading of a single query or table - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- Bulk reading of multiple queries or tables - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- Bundle<T,
CollectionT> - Interface in org.apache.beam.runners.local -
An immutable collection of elements which are part of a
PCollection. - BUNDLE - Enum constant in enum class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
-
The source file contains one or more lines of newline-delimited JSON (ndjson).
- BundleCheckpointHandler - Interface in org.apache.beam.runners.fnexecution.control
-
A handler which is invoked when the SDK returns
BeamFnApi.DelayedBundleApplications as part of the bundle completion. - BundleCheckpointHandlers - Class in org.apache.beam.runners.fnexecution.control
-
Utility methods for creating
BundleCheckpointHandlers. - BundleCheckpointHandlers() - Constructor for class org.apache.beam.runners.fnexecution.control.BundleCheckpointHandlers
- BundleCheckpointHandlers.StateAndTimerBundleCheckpointHandler<T> - Class in org.apache.beam.runners.fnexecution.control
-
A
BundleCheckpointHandler which uses TimerInternals.TimerData and ValueState to reschedule BeamFnApi.DelayedBundleApplication. - BundleFinalizationHandler - Interface in org.apache.beam.runners.fnexecution.control
-
A handler for the runner when a finalization request has been received.
- BundleFinalizationHandlers - Class in org.apache.beam.runners.fnexecution.control
-
Utility methods for creating
BundleFinalizationHandlers. - BundleFinalizationHandlers() - Constructor for class org.apache.beam.runners.fnexecution.control.BundleFinalizationHandlers
- BundleFinalizationHandlers.InMemoryFinalizer - Class in org.apache.beam.runners.fnexecution.control
- bundleFinalizer() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.FlinkStepContext
- BundleProcessorCacheTimeoutFactory() - Constructor for class org.apache.beam.sdk.options.SdkHarnessOptions.BundleProcessorCacheTimeoutFactory
- BundleProgressHandler - Interface in org.apache.beam.runners.fnexecution.control
-
A handler for bundle progress messages, both during bundle execution and on its completion.
- BundleSplitHandler - Interface in org.apache.beam.runners.fnexecution.control
-
A handler which is invoked whenever an active bundle is split.
- by(Contextful<Contextful.Fn<UserT, DestinationT>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Like
FileIO.Write.by(org.apache.beam.sdk.transforms.SerializableFunction<UserT, DestinationT>), but with access to context such as side inputs. - by(SerializableFunction<UserT, DestinationT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Specifies how to partition elements into groups ("destinations").
- by(PredicateT) - Static method in class org.apache.beam.sdk.transforms.Filter
-
Returns a
PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that satisfy the given predicate. - By() - Constructor for class org.apache.beam.sdk.schemas.transforms.CoGroup.By
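As a hedged illustration of the Filter.by entry above (the class name and input values are invented for the sketch, not taken from the Beam sources), a minimal pipeline that keeps only elements satisfying a predicate might look like:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.Filter;
import org.apache.beam.sdk.values.PCollection;

public class FilterByExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create();
    // Filter.by keeps only the elements for which the predicate returns true.
    PCollection<Integer> evens =
        p.apply(Create.of(1, 2, 3, 4, 5))
            .apply(Filter.by((Integer n) -> n % 2 == 0));
    p.run().waitUntilFinish();
  }
}
```

Running this requires the Beam Java SDK and a runner (e.g. the direct runner) on the classpath; the sketch uses only the lambda-friendly Filter.by(PredicateT) overload indexed here.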
- byFieldAccessDescriptor(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
-
Returns a transform that groups all elements in the input
PCollection keyed by the fields specified. - byFieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
-
Returns a transform that groups all elements in the input
PCollection keyed by the list of fields specified. - byFieldIds(Iterable<Integer>) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
-
Same as
Group.byFieldIds(Integer...). - byFieldNames(Iterable<String>) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
-
Same as
Group.byFieldNames(String...). - byFieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
-
Returns a transform that groups all elements in the input
PCollection keyed by the list of fields specified. - ByFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.ByFields
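To sketch the Group.byFieldNames entries above (the `purchases` collection and its field names are hypothetical), schema-aware grouping looks roughly like this:

```java
import org.apache.beam.sdk.schemas.transforms.Group;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

// Assuming `purchases` is an existing schema-aware PCollection<Row> with
// "userId" and "country" fields. Each output Row pairs a "key" row holding
// the grouped fields with an iterable "value" of the matching elements.
PCollection<Row> grouped =
    purchases.apply(Group.byFieldNames("userId", "country"));
```

This is a fragment, not a complete program: it assumes a pipeline already producing `purchases` with an attached schema, which is what lets Group address fields by name.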
- byId(int, int, RetryConfiguration, AsyncBatchWriteHandler.Stats, BiFunction<String, List<RecT>, CompletableFuture<List<ErrT>>>, Function<ErrT, String>, Function<RecT, String>, Function<ErrT, String>) - Static method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
AsyncBatchWriteHandler that correlates records and results by id; all results are erroneous. - byId(int, FluentBackoff, AsyncBatchWriteHandler.Stats, BiFunction<String, List<RecT>, CompletableFuture<List<ErrT>>>, Function<ErrT, String>, Function<RecT, String>, Function<ErrT, String>) - Static method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
- byId(int, FluentBackoff, AsyncBatchWriteHandler.Stats, BiFunction<String, List<RecT>, CompletableFuture<List<ErrT>>>, Function<ErrT, String>, Function<RecT, String>, Function<ErrT, String>) - Static method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
AsyncBatchWriteHandler that correlates records and results by id; all results are erroneous. - byKey() - Static method in class org.apache.beam.sdk.transforms.Redistribute
- byKey() - Static method in class org.apache.beam.sdk.transforms.Redistribute
- byOffsetShard(Integer) - Static method in class org.apache.beam.sdk.io.kafka.KafkaReadRedistribute
- byPosition(int, int, RetryConfiguration, AsyncBatchWriteHandler.Stats, BiFunction<String, List<RecT>, CompletableFuture<List<ResT>>>, Function<ResT, String>) - Static method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
AsyncBatchWriteHandler that correlates records and results by position in the respective list.
- byPosition(int, FluentBackoff, AsyncBatchWriteHandler.Stats, BiFunction<String, List<RecT>, CompletableFuture<List<ResT>>>, Function<ResT, String>) - Static method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
AsyncBatchWriteHandler that correlates records and results by position in the respective list.
- byRecordKey(Integer) - Static method in class org.apache.beam.sdk.io.kafka.KafkaReadRedistribute
- BYTE - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- BYTE - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of byte fields.
- ByteArray - Class in org.apache.beam.runners.spark.util
-
Serializable byte array.
- ByteArray(byte[]) - Constructor for class org.apache.beam.runners.spark.util.ByteArray
- ByteArrayCoder - Class in org.apache.beam.sdk.coders
-
A
Coder for byte[]. - ByteArrayKey(byte[]) - Constructor for class org.apache.beam.runners.jet.Utils.ByteArrayKey
- ByteBuddyUtils - Class in org.apache.beam.sdk.schemas.utils
- ByteBuddyUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils
- ByteBuddyUtils.ConvertType - Class in org.apache.beam.sdk.schemas.utils
-
Given a Java type, returns the Java type expected for use with Row. - ByteBuddyUtils.ConvertValueForGetter - Class in org.apache.beam.sdk.schemas.utils
- ByteBuddyUtils.ConvertValueForGetter - Class in org.apache.beam.sdk.schemas.utils
-
Takes a
StackManipulationthat returns a value. - ByteBuddyUtils.ConvertValueForSetter - Class in org.apache.beam.sdk.schemas.utils
-
Row is going to call the setter with its internal Java type; however, the user object being set might have a different type internally. - ByteBuddyUtils.DefaultTypeConversionsFactory - Class in org.apache.beam.sdk.schemas.utils
- ByteBuddyUtils.DefaultTypeConversionsFactory - Class in org.apache.beam.sdk.schemas.utils
- ByteBuddyUtils.InjectPackageStrategy - Class in org.apache.beam.sdk.schemas.utils
-
A naming strategy for ByteBuddy classes.
- ByteBuddyUtils.TransformingMap<K1,
V1, - Class in org.apache.beam.sdk.schemas.utilsK2, V2> - ByteBuddyUtils.TypeConversion<T> - Class in org.apache.beam.sdk.schemas.utils
- ByteBuddyUtils.TypeConversionsFactory - Interface in org.apache.beam.sdk.schemas.utils
- ByteBufferBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ByteBufferBundle
- ByteCoder - Class in org.apache.beam.sdk.coders
- ByteKey - Class in org.apache.beam.sdk.io.range
-
A class representing a key consisting of an array of bytes.
- ByteKeyRange - Class in org.apache.beam.sdk.io.range
-
A class representing a range of
ByteKeys. - ByteKeyRangeTracker - Class in org.apache.beam.sdk.io.range
- ByteKeyRangeTracker - Class in org.apache.beam.sdk.transforms.splittabledofn
- bytes() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The
TypeDescriptor for Byte. - Bytes() - Constructor for class org.apache.beam.sdk.jmh.util.VarIntBenchmark.Bytes
- BYTES - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- BYTES - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of bytes fields.
- BytesBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.BytesBundle
- bytesRead() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Counter of bytes read by a source.
- bytesReadBySplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Counter of bytes read by a source split.
- BytesThroughputEstimator<T> - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
-
An estimator to provide an estimate on the byte throughput of the outputted elements.
- BytesThroughputEstimator<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
-
An estimator to provide an estimate on the throughput of the outputted elements.
- BytesThroughputEstimator(int, SizeEstimator<T>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
- BytesThroughputEstimator(SizeEstimator<T>, int, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
- BytesThroughputEstimator(SizeEstimator<T>, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
- bytesToRowFn(SchemaProvider, TypeDescriptor<T>, Coder<? extends T>) - Static method in class org.apache.beam.sdk.schemas.RowMessages
- bytesToRowFn(SchemaProvider, TypeDescriptor<T>, ProcessFunction<byte[], ? extends T>) - Static method in class org.apache.beam.sdk.schemas.RowMessages
- byteString(byte[]) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- byteString(byte[]) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
- ByteStringCoder - Class in org.apache.beam.runners.fnexecution.wire
-
A duplicate of
ByteStringCoder that uses the Apache Beam vendored protobuf. - ByteStringCoder - Class in org.apache.beam.sdk.extensions.protobuf
-
A
Coder for ByteString objects based on their encoded Protocol Buffer form. - ByteStringOutput() - Constructor for class org.apache.beam.sdk.jmh.util.VarIntBenchmark.ByteStringOutput
- ByteStringOutputStreamBenchmark - Class in org.apache.beam.sdk.jmh.util
-
Benchmarks for
ByteStringOutputStream. - ByteStringOutputStreamBenchmark() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- ByteStringOutputStreamBenchmark.NewVsCopy - Class in org.apache.beam.sdk.jmh.util
-
The benchmarks below detail the cost of creating a new buffer vs. copying a subset of the existing one and re-using the larger one.
- ByteStringOutputStreamBenchmark.NewVsCopy.ArrayCopyState - Class in org.apache.beam.sdk.jmh.util
- ByteStringOutputStreamBenchmark.NewVsCopy.ArrayNewState - Class in org.apache.beam.sdk.jmh.util
- ByteStringOutputStreamBenchmark.ProtobufByteStringOutputStream - Class in org.apache.beam.sdk.jmh.util
- ByteStringOutputStreamBenchmark.SdkCoreByteStringOutputStream - Class in org.apache.beam.sdk.jmh.util
- ByteStringRangeHelper - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
-
Helper functions to evaluate the completeness of a collection of ByteStringRanges.
- ByteStringRangeHelper() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
- byteStringUtf8(String) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- byteStringUtf8(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
- bytesWritten() - Static method in class org.apache.beam.sdk.metrics.SinkMetrics
-
Counter of bytes written to a sink.
- ByteToElemFunction<V> - Class in org.apache.beam.runners.twister2.translators.functions
-
ByteToElem function.
- ByteToElemFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToElemFunction
- ByteToElemFunction(WindowedValues.WindowedValueCoder<V>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToElemFunction
- ByteToWindowFunction<K,
V> - Class in org.apache.beam.runners.twister2.translators.functions -
ByteToWindow function.
- ByteToWindowFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunction
- ByteToWindowFunction(Coder<K>, WindowedValues.WindowedValueCoder<V>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunction
- ByteToWindowFunctionPrimitive<K,
V> - Class in org.apache.beam.runners.twister2.translators.functions -
ByteToWindow function.
- ByteToWindowFunctionPrimitive() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunctionPrimitive
- ByteToWindowFunctionPrimitive(Coder<K>, WindowedValues.WindowedValueCoder<V>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunctionPrimitive
- BZIP2 - Enum constant in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- BZIP2 - Enum constant in enum class org.apache.beam.sdk.io.Compression
-
BZip2 compression.
- BZIP2 - Enum constant in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- BZIP2 - Enum constant in enum class org.apache.beam.sdk.io.TextIO.CompressionType
-
Deprecated.
- BZIP2 - Enum constant in enum class org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
-
Deprecated.
C
- cache(String, Coder<?>) - Method in class org.apache.beam.runners.spark.translation.BoundedDataset
- cache(String, Coder<?>) - Method in interface org.apache.beam.runners.spark.translation.Dataset
- cache(String, Coder<?>) - Method in class org.apache.beam.runners.spark.translation.streaming.UnboundedDataset
- Cache - Class in org.apache.beam.io.requestresponse
-
Transforms for reading and writing request/response associations to a cache.
- Cache() - Constructor for class org.apache.beam.io.requestresponse.Cache
- Cache.Pair<RequestT,
ResponseT> - Class in org.apache.beam.io.requestresponse -
A simple POJO that holds both cache read and write
PTransforms. - CACHED_CREATORS - Static variable in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- CACHED_CREATORS - Static variable in class org.apache.beam.sdk.schemas.utils.POJOUtils
- CachedSideInputReader - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
-
SideInputReader that caches results for costly
Materializations. - CachedSideInputReader - Class in org.apache.beam.runners.spark.util
-
SideInputReader that caches materialized views. - CacheFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.cache
- CacheFactory(DaoFactory, Duration) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.cache.CacheFactory
- CachingFactory<CreatedT> - Class in org.apache.beam.sdk.schemas
-
A wrapper around a
Factory that assumes the schema parameter never changes. - CachingFactory(Factory<CreatedT>) - Constructor for class org.apache.beam.sdk.schemas.CachingFactory
- CalciteConnectionWrapper - Class in org.apache.beam.sdk.extensions.sql.impl
-
Abstract wrapper for
CalciteConnection to simplify extension. - CalciteConnectionWrapper(CalciteConnection) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- CalciteFactoryWrapper - Class in org.apache.beam.sdk.extensions.sql.impl
-
Wrapper for
CalciteFactory. - CalciteQueryPlanner - Class in org.apache.beam.sdk.extensions.sql.impl
-
The core component that handles a SQL statement, from explaining the execution plan to generating a Beam pipeline.
- CalciteQueryPlanner(JdbcConnection, Collection<RuleSet>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
-
Called by
BeamSqlEnv.instantiatePlanner() reflectively. - CalciteQueryPlanner.NonCumulativeCostImpl - Class in org.apache.beam.sdk.extensions.sql.impl
- CalciteUtils - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
Utility methods for Calcite related operations.
- CalciteUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- CalciteUtils.TimeWithLocalTzType - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
A LogicalType corresponding to TIME_WITH_LOCAL_TIME_ZONE.
- CalcRelSplitter - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
CalcRelSplitter operates on a
Calc with multiple RexCall sub-expressions that cannot all be implemented by a single concrete RelNode. - CalcRelSplitter(Calc, RelBuilder, CalcRelSplitter.RelType[]) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
-
Constructs a CalcRelSplitter.
- CalcRelSplitter.RelType - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
Type of relational expression.
- calculateRanges(PartitionT, PartitionT, Long) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcReadWithPartitionsHelper
-
Calculate the range of each partition from the lower and upper bound, and number of partitions.
- CalendarWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A collection of
WindowFns that window values into calendar-based windows such as spans of days, months, or years. - CalendarWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.CalendarWindows
- CalendarWindows.DaysWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A
WindowFn that windows elements into periods measured by days. - CalendarWindows.MonthsWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A
WindowFn that windows elements into periods measured by months. - CalendarWindows.YearsWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A
WindowFn that windows elements into periods measured by years. - call() - Method in class org.apache.beam.runners.spark.translation.streaming.SparkRunnerStreamingContextFactory
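A brief sketch of the CalendarWindows entries above (the `events` collection is assumed to exist); unlike fixed-duration windows, these are aligned to calendar boundaries:

```java
import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.PCollection;

// Daily windows aligned to calendar days:
PCollection<String> daily =
    events.apply(Window.into(CalendarWindows.days(1)));

// Monthly windows that begin on the 10th day of each month:
PCollection<String> monthly =
    events.apply(Window.into(CalendarWindows.months(1).beginningOnDay(10)));
```

This fragment assumes a pipeline already producing `events`; the beginningOnDay adjustment shows why a calendar-based WindowFn is used instead of FixedWindows, whose windows cannot track month-length variation.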
- call(Iterator<WindowedValue<InputT>>) - Method in class org.apache.beam.runners.spark.translation.MultiDoFnFunction
- call(K, Iterator<WindowedValue<KV<K, InputT>>>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.GroupAlsoByWindowViaOutputBufferFn
- call(WindowedValue<KV<K, Iterable<InputT>>>) - Method in class org.apache.beam.runners.spark.translation.TranslationUtils.CombineGroupedValues
- call(WindowedValue<KV<K, V>>) - Method in class org.apache.beam.runners.spark.translation.ReifyTimestampsAndWindowsFunction
- call(WindowedValue<T>) - Method in class org.apache.beam.runners.spark.translation.SparkAssignWindowFn
- call(RequestT) - Method in interface org.apache.beam.io.requestresponse.Caller
- call(Tuple2<ByteArray, byte[]>) - Method in class org.apache.beam.runners.spark.coders.CoderHelpers.FromByteFunction
- call(Tuple2<TupleTag<V>, WindowedValue<?>>) - Method in class org.apache.beam.runners.spark.translation.TranslationUtils.TupleTagFilter
- Caller<RequestT,
ResponseT> - Interface in org.apache.beam.io.requestresponse -
Caller interfaces user custom code intended for API calls. - CallShouldBackoff<ResponseT> - Interface in org.apache.beam.io.requestresponse
-
Informs whether a call to an API should back off. - cancel() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
- cancel() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
- cancel() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
- cancel() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
- cancel() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
- cancel() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
- cancel() - Method in class org.apache.beam.runners.flink.translation.functions.ImpulseSourceFunction
- cancel() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.StreamingImpulseSource
-
Deprecated.
- cancel() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.TestStreamSource
- cancel() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
- cancel() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
- cancel() - Method in class org.apache.beam.runners.jet.JetPipelineResult
- cancel() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Cancel the job.
- cancel() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
- cancel() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult
- cancel() - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
- cancel() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.BigQueryServerStream
-
Cancels the stream, releasing any client- and server-side resources.
- cancel() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
- cancel() - Method in interface org.apache.beam.sdk.PipelineResult
-
Cancels the pipeline execution.
- cancel(Exception) - Method in class org.apache.beam.sdk.fn.CancellableQueue
-
Causes any pending and future
CancellableQueue.put(T) and CancellableQueue.take() invocations to throw an exception. - cancel(JobApi.CancelJobRequest, StreamObserver<JobApi.CancelJobResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- CancellableQueue<T> - Class in org.apache.beam.sdk.fn
-
A simplified
ThreadSafe blocking queue that can be cancelled, freeing any blocked Threads and preventing future Threads from blocking. - CancellableQueue(int) - Constructor for class org.apache.beam.sdk.fn.CancellableQueue
-
Creates a
ThreadSafe blocking queue with a maximum capacity. - cancelled() - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
-
Report that the pipeline has been cancelled.
- CANCELLED - Enum constant in enum class org.apache.beam.sdk.PipelineResult.State
-
The job has been explicitly cancelled.
- canConvertConvention(Convention) - Method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
- canEqual(Object) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- canEqual(Object) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- canImplement(LogicalCalc, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
-
Returns whether a relational expression can be implemented solely in a given
CalcRelSplitter.RelType. - canImplement(RexCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
- canImplement(RexDynamicParam) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
- canImplement(RexFieldAccess) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
- canImplement(RexLiteral) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
- canImplement(RexNode, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
-
Returns whether this
RelType can implement a given expression. - canImplement(RexProgram) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
-
Returns whether this tester's
RelType can implement a given program. - CannotProvideCoderException - Exception Class in org.apache.beam.sdk.coders
-
The exception thrown when a
CoderRegistry or CoderProvider cannot provide a Coder that has been requested. - CannotProvideCoderException(String) - Constructor for exception class org.apache.beam.sdk.coders.CannotProvideCoderException
- CannotProvideCoderException(String, Throwable) - Constructor for exception class org.apache.beam.sdk.coders.CannotProvideCoderException
- CannotProvideCoderException(String, Throwable, CannotProvideCoderException.ReasonCode) - Constructor for exception class org.apache.beam.sdk.coders.CannotProvideCoderException
- CannotProvideCoderException(String, CannotProvideCoderException.ReasonCode) - Constructor for exception class org.apache.beam.sdk.coders.CannotProvideCoderException
- CannotProvideCoderException(Throwable) - Constructor for exception class org.apache.beam.sdk.coders.CannotProvideCoderException
- CannotProvideCoderException(Throwable, CannotProvideCoderException.ReasonCode) - Constructor for exception class org.apache.beam.sdk.coders.CannotProvideCoderException
- CannotProvideCoderException.ReasonCode - Enum Class in org.apache.beam.sdk.coders
-
Indicates the reason that
Coder inference failed. - canStopPolling(Instant, StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
Called by the
Watch transform to determine whether the given termination state signals that Watch should stop calling Watch.Growth.PollFn for the current input, regardless of whether the last Watch.Growth.PollResult was complete or incomplete. - canTranslate(TransformT) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator
-
Checks if a composite / primitive transform can be translated.
- CassandraIO - Class in org.apache.beam.sdk.io.cassandra
-
An IO to read and write from/to Apache Cassandra
- CassandraIO.MutationType - Enum Class in org.apache.beam.sdk.io.cassandra
-
Specify the mutation type: either write or delete.
- CassandraIO.Read<T> - Class in org.apache.beam.sdk.io.cassandra
-
A
PTransform to read data from Apache Cassandra. - CassandraIO.ReadAll<T> - Class in org.apache.beam.sdk.io.cassandra
-
A
PTransform to read data from Apache Cassandra. - CassandraIO.Write<T> - Class in org.apache.beam.sdk.io.cassandra
-
A
PTransform to mutate data in Apache Cassandra. - Cassandra Socket Options - Search tag in class org.apache.beam.sdk.io.cassandra.CassandraIO
- Section
- Cast<T> - Class in org.apache.beam.sdk.schemas.transforms
-
Set of utilities for casting rows between schemas.
- Cast() - Constructor for class org.apache.beam.sdk.schemas.transforms.Cast
- Cast.CompatibilityError - Class in org.apache.beam.sdk.schemas.transforms
-
Describes compatibility errors during casting.
- Cast.Narrowing - Class in org.apache.beam.sdk.schemas.transforms
-
Narrowing changes type without guarantee to preserve data.
- Cast.Validator - Interface in org.apache.beam.sdk.schemas.transforms
-
Interface for statically validating casts.
- Cast.Widening - Class in org.apache.beam.sdk.schemas.transforms
-
Widening changes to type that can represent any possible value of the original type.
- castNumber(Number, Schema.TypeName, Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
- castRow(Row, Schema, Schema) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
- castValue(Object, Schema.FieldType, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
- catalog() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
- catalog() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- Catalog - Interface in org.apache.beam.sdk.extensions.sql.meta.catalog
-
Represents a named and configurable container for managing tables.
- catalogManager(CatalogManager) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
- CatalogManager - Interface in org.apache.beam.sdk.extensions.sql.meta.catalog
-
Top-level authority that manages
Catalogs. - CatalogManagerSchema - Class in org.apache.beam.sdk.extensions.sql.impl
-
A Calcite
Schema that corresponds to a CatalogManager. - catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
- catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
- catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
- CatalogRegistrar - Interface in org.apache.beam.sdk.extensions.sql.meta.catalog
-
Over-arching registrar to capture available
Catalogs. - catalogs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.CatalogManager
- catalogs() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.EmptyCatalogManager
- catalogs() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogManager
- CatalogSchema - Class in org.apache.beam.sdk.extensions.sql.impl
-
A Calcite
Schema that corresponds to a Catalog. - catchUpToNow - Variable in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
- catchUpToNow(boolean) - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
-
For internal use only; no backwards-compatibility guarantees.
- CdapIO - Class in org.apache.beam.sdk.io.cdap
-
A
CdapIO is a Transform for reading data from a source or writing data to a sink of a CDAP Plugin. - CdapIO() - Constructor for class org.apache.beam.sdk.io.cdap.CdapIO
- CdapIO.Read<K,
V> - Class in org.apache.beam.sdk.io.cdap -
A
PTransform to read from a CDAP source. - CdapIO.Write<K,
V> - Class in org.apache.beam.sdk.io.cdap -
A
PTransform to write to a CDAP sink. - cdapPluginObj - Variable in class org.apache.beam.sdk.io.cdap.Plugin
- CELL_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
- CEPCall - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
A
CEPCall instance represents an operation (node) that contains an operator and a list of operands. - CEPFieldRef - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
A CEPFieldRef instance represents a node that points to a specified field in a Row.
- CEPKind - Enum Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
CEPKind corresponds to Calcite's SqlKind.
- CEPLiteral - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
CEPLiteral represents a literal node.
- CEPMeasure - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
The CEPMeasure class represents the Measures clause and contains information about output columns.
- CEPMeasure(Schema, String, CEPOperation) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
- CEPOperation - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
CEPOperation is the base class for the evaluation operations defined in the DEFINE syntax of MATCH_RECOGNIZE.
- CEPOperation() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperation
- CEPOperator - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
The CEPOperator records the operators (i.e.
- CEPPattern - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
Core pattern class that stores the definition of a single pattern.
- CEPUtils - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
Some utility methods for transforming Calcite's constructs into our own Beam constructs (for serialization purposes).
- CEPUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
- CF_CONTINUATION_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_INITIAL_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_LOCK - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_MISSING_PARTITIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_PARENT_LOW_WATERMARKS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_PARENT_PARTITIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_SHOULD_DELETE - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_VERSION - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_WATERMARK - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CHANGE_SQN_COLUMN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
- CHANGE_STREAM_MUTATION_GC_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Counter for the total number of ChangeStreamMutations that are initiated by garbage collection (not user initiated) identified during the execution of the Connector.
- CHANGE_STREAM_MUTATION_USER_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Counter for the total number of ChangeStreamMutations that are initiated by users (not garbage collection) identified during the execution of the Connector.
- CHANGE_TYPE_COLUMN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
- changeStreamAction(ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class for processing individual ChangeStreamMutation in ReadChangeStreamPartitionDoFn.
- ChangeStreamAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
-
This class is responsible for processing individual ChangeStreamRecord.
- ChangeStreamAction(ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ChangeStreamAction
-
Constructs ChangeStreamAction to process individual ChangeStreamRecord.
- ChangeStreamContinuationTokenHelper - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
- ChangeStreamContinuationTokenHelper() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamContinuationTokenHelper
- ChangeStreamDao - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
-
Data access object to list and read stream partitions of a table.
- ChangeStreamDao - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Responsible for making change stream queries for a given partition.
- ChangeStreamDao(BigtableDataClient, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.ChangeStreamDao
- ChangeStreamMetrics - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
-
Class to aggregate metrics related functionality.
- ChangeStreamMetrics - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
-
Class to aggregate metrics related functionality.
- ChangeStreamMetrics() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
- ChangeStreamMetrics() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Constructs a ChangeStreamMetrics instance with the following metrics enabled by default.
- ChangeStreamMetrics(Set<MetricName>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Constructs a ChangeStreamMetrics instance with the given metrics enabled.
- changeStreamQuery(String, Timestamp, Timestamp, long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamDao
-
Performs a change stream query.
- ChangeStreamReaderBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ChangeStreamReaderBuilder
- ChangeStreamRecord - Interface in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents a Spanner Change Stream Record.
- changeStreamRecordMapper() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.MapperFactory
-
Creates and returns a singleton instance of a mapper class capable of transforming a Struct into a List of ChangeStreamRecord subclasses.
- ChangeStreamRecordMapper - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
- ChangeStreamRecordMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Holds internal execution metrics / metadata for the processed ChangeStreamRecord.
- ChangeStreamRecordMetadata.Builder - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
- ChangeStreamResultSet - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Decorator class over a ResultSet that provides telemetry for the streamed records.
- ChangeStreamResultSetMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Represents telemetry metadata gathered during the consumption of a change stream query.
- ChangeStreamsConstants - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
-
Single place for defining the constants used in the Spanner.readChangeStreams() connector.
- ChangeStreamsConstants() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
- channelNames - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- CHAR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- characters() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The TypeDescriptor for Character.
- check(RelNode) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall.JoinChecker
- checkClientTrusted(X509Certificate[], String) - Method in class org.apache.beam.sdk.io.splunk.CustomX509TrustManager
- checkConfiguration(ClientConfiguration, AwsOptions) - Method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
Check if all necessary configuration is available to create clients.
- checkConfiguration(ClientConfiguration, AwsOptions) - Method in class org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory.DefaultClientBuilder
- checkDone() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
-
This is to signal to the runner that this restriction has completed.
- checkDone() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
-
Checks if the restriction has been processed successfully.
- checkDone() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
- checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
- checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
- checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
-
Checks whether the restriction has been fully processed.
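The checkDone() contract listed above can be exercised with the stock OffsetRangeTracker: checkDone() returns normally only once every position in the restriction has been claimed (or a claim past the end has failed). A minimal sketch, not taken from this index, using the public tracker API:

```java
import org.apache.beam.sdk.io.range.OffsetRange;
import org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker;

public class CheckDoneSketch {
    public static void main(String[] args) {
        // Restriction covering offsets [0, 3).
        OffsetRange range = new OffsetRange(0, 3);
        OffsetRangeTracker tracker = new OffsetRangeTracker(range);

        // Claim offsets one by one, as a splittable DoFn would while processing.
        long offset = range.getFrom();
        while (tracker.tryClaim(offset)) {
            offset++;
        }

        // tryClaim(3) failed past the end of the range, so the restriction is
        // fully processed and checkDone() returns instead of throwing.
        tracker.checkDone();
        System.out.println("claimed up to " + offset);
    }
}
```

If any offset in the range had been left unclaimed, checkDone() would throw, signaling an incompletely processed restriction to the runner.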
- checkExceptionAndMaybeThrow() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- checkForAsyncFailure() - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
Check if any failure happened async.
- checkIdleTimeoutAndMaybeStartCountdown() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- checkIfAnySubscriptionExists(String, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
- checkpoint(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
-
Should be called when a checkpoint is created.
- Checkpoint - Class in org.apache.beam.runners.spark.translation.streaming
-
Checkpoint data to make it available in future pipeline runs.
- Checkpoint() - Constructor for class org.apache.beam.runners.spark.translation.streaming.Checkpoint
- Checkpoint.CheckpointDir - Class in org.apache.beam.runners.spark.translation.streaming
-
Checkpoint dir tree.
- checkpointCompleted(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
-
Should be called when a checkpoint is completed.
- CheckpointDir(String) - Constructor for class org.apache.beam.runners.spark.translation.streaming.Checkpoint.CheckpointDir
- checkpointIfNeeded(DStream<?>, SerializablePipelineOptions) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Checkpoints the given DStream if checkpointing is enabled in the pipeline options.
- CheckpointMarkImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- CheckpointStats - Class in org.apache.beam.runners.flink.translation.utils
-
Helpers for reporting checkpoint durations.
- CheckpointStats(Supplier<DistributionCell>) - Constructor for class org.apache.beam.runners.flink.translation.utils.CheckpointStats
- checkServerTrusted(X509Certificate[], String) - Method in class org.apache.beam.sdk.io.splunk.CustomX509TrustManager
- CheckStopReadingFn - Interface in org.apache.beam.sdk.io.kafka
- CheckStopReadingFnWrapper - Class in org.apache.beam.sdk.io.kafka
- checksum() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
-
An optional checksum to identify the contents of a file.
- ChildPartition - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
A child partition represents a new partition that should be queried.
- ChildPartition(String, String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
Constructs a child partition, which will have its own token and the parent that it originated from.
- ChildPartition(String, HashSet<String>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
Constructs a child partition, which will have its own token and the parents that it originated from.
- ChildPartitionsRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents a ChildPartitionsRecord.
- ChildPartitionsRecord(Timestamp, String, List<ChildPartition>, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
Constructs a child partitions record containing one or more child partitions.
- childPartitionsRecordAction(PartitionMetadataDao, ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class capable of processing ChildPartitionsRecords.
- ChildPartitionsRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is part of the process for the ReadChangeStreamPartitionDoFn SDF.
- Choosing an End Point (ICEBERG_CDC only) - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- Choosing a Starting Point (ICEBERG_CDC only) - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- CivilTimeEncoder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Encoder for TIME and DATETIME values, according to civil_time encoding.
- classesToTranslators() - Method in interface org.apache.beam.runners.dataflow.util.CoderCloudObjectTranslatorRegistrar
- classesToTranslators() - Method in class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
- ClassLoaderFileSystem - Class in org.apache.beam.sdk.io
-
A read-only FileSystem implementation looking up resources using a ClassLoader.
- ClassLoaderFileSystem.ClassLoaderFileSystemRegistrar - Class in org.apache.beam.sdk.io
-
AutoService registrar for the ClassLoaderFileSystem.
- ClassLoaderFileSystem.ClassLoaderResourceId - Class in org.apache.beam.sdk.io
- ClassLoaderFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderFileSystemRegistrar
- classNamesToTranslators() - Method in interface org.apache.beam.runners.dataflow.util.CoderCloudObjectTranslatorRegistrar
-
Gets a map from the name returned by CloudObject.getClassName() to a translator that can convert into the equivalent Coder.
- classNamesToTranslators() - Method in class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
- ClassWithSchema() - Constructor for class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
- CleanTmpFilesFromGcsFn(ValueProvider<String>, String) - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read.CleanTmpFilesFromGcsFn
-
Creates an object that will remove temp files from the stage.
- cleanup() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
- cleanUp() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- cleanUpPrefix() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
-
Delete all the metadata rows starting with the change stream name prefix, except for the detect-new-partition row, because it signals the existence of a pipeline with the change stream name.
- CleanUpReadChangeStreamDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
- CleanUpReadChangeStreamDoFn(DaoFactory) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.CleanUpReadChangeStreamDoFn
- clear() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.KeyedBufferingElementsHandler
- clear() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.NonKeyedBufferingElementsHandler
- clear() - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
- clear() - Static method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
- clear() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
- clear() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- clear() - Method in interface org.apache.beam.sdk.state.State
-
Clear out the state location.
- clear() - Method in interface org.apache.beam.sdk.state.Timer
-
Clears a timer.
- clear(K, W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
-
Clears the bag user state for the given key and window.
- clearCache() - Static method in class org.apache.beam.runners.spark.io.MicrobatchSource
- clearGlobalState() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals
-
Allows clearing all state for the global watermark when the maximum watermark arrives.
- clearOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- clearOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- clearProvidedSparkContext() - Static method in class org.apache.beam.runners.spark.translation.SparkContextFactory
- clearRange(Instant, Instant) - Method in interface org.apache.beam.sdk.state.OrderedListState
-
Clear a timestamp-limited subrange of the list.
- clearState(ReduceFn.Context) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
- clearWarnings() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- ClickHouseIO - Class in org.apache.beam.sdk.io.clickhouse
-
An IO to write to ClickHouse.
- ClickHouseIO() - Constructor for class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
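As a sketch of how the ClickHouseIO entries above fit together: the static write() factory builds a Write transform from a JDBC URL and a table name (the URL and table below are placeholder values, not taken from this index). Building the transform does not contact the server:

```java
import org.apache.beam.sdk.io.clickhouse.ClickHouseIO;
import org.apache.beam.sdk.values.Row;

public class ClickHouseWriteSketch {
    public static void main(String[] args) {
        // Builds the write transform configuration only; applying it to a
        // PCollection<Row> (with a reachable ClickHouse server) is a separate step.
        ClickHouseIO.Write<Row> write =
            ClickHouseIO.<Row>write("jdbc:clickhouse://localhost:8123/default", "my_table");
        System.out.println(write != null);
    }
}
```

In a pipeline this would typically be the terminal step, e.g. rows.apply(write).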
- ClickHouseIO.Write<T> - Class in org.apache.beam.sdk.io.clickhouse
-
A PTransform to write to ClickHouse.
- ClickHouseWriter - Class in org.apache.beam.sdk.io.clickhouse
-
Writes Rows and field values using ClickHousePipedOutputStream.
- ClickHouseWriter() - Constructor for class org.apache.beam.sdk.io.clickhouse.ClickHouseWriter
- clientBuffered(ExecutorService) - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
-
Create a buffering OutboundObserverFactory for client-side RPCs with the specified ExecutorService and the default buffer size.
- clientBuffered(ExecutorService, int) - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
-
Create a buffering OutboundObserverFactory for client-side RPCs with the specified ExecutorService and buffer size.
- ClientBuilderFactory - Interface in org.apache.beam.sdk.io.aws2.common
-
Factory to build and configure any AwsClientBuilder using a specific ClientConfiguration or the globally provided settings in AwsOptions as fallback.
- ClientBuilderFactory.DefaultClientBuilder - Class in org.apache.beam.sdk.io.aws2.common
-
Default implementation of ClientBuilderFactory.
- ClientBuilderFactory.SkipCertificateVerificationTrustManagerProvider - Class in org.apache.beam.sdk.io.aws2.common
-
Trust provider to skip certificate verification.
- ClientConfiguration - Class in org.apache.beam.sdk.io.aws2.common
-
AWS client configuration.
- ClientConfiguration() - Constructor for class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
- ClientConfiguration.Builder - Class in org.apache.beam.sdk.io.aws2.common
- clientDirect() - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
-
Create the default OutboundObserverFactory for client-side RPCs, which uses basic unbuffered flow control.
- Client-side rate limiting - Search tag in class org.apache.beam.sdk.io.googleads.GoogleAdsV19
- Section
- Clock - Interface in org.apache.beam.runners.direct
-
Access to the current time.
- clone() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
- clone() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
- CLONE_ONCE - Enum constant in enum class org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior
-
Deprecated.
- CLONE_PER_BUNDLE - Enum constant in enum class org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior
-
Deprecated. Clone the DoFn and call DoFn.Setup every time a bundle starts; call DoFn.Teardown every time a bundle finishes.
- clonesOf(OutT) - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
- close() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil
- close() - Method in class org.apache.beam.runners.flink.metrics.FileReporter
- close() - Method in class org.apache.beam.runners.flink.translation.functions.FlinkDoFnFunction
- close() - Method in class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction
- close() - Method in class org.apache.beam.runners.flink.translation.functions.FlinkStatefulDoFnFunction
- close() - Method in class org.apache.beam.runners.flink.translation.utils.Locker
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplitEnumerator
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.LazyFlinkSourceSplitEnumerator
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SplittableDoFnOperator
- close() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- close() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
- close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext
- close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
- close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.WrappedSdkHarnessClient
- close() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
- close() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
- close() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
-
Closes this bundle.
- close() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
-
Blocks until bundle processing is finished.
- close() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
- close() - Method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
-
Deprecated.
- close() - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
- close() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
- close() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
- close() - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironment
- close() - Method in class org.apache.beam.runners.fnexecution.logging.GrpcLoggingService
- close() - Method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
- close() - Method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
- close() - Method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
- close() - Method in class org.apache.beam.runners.jet.processors.ParDoP
- close() - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- close() - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
- close() - Method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
- close() - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- close() - Method in class org.apache.beam.runners.portability.CloseableResource
-
Closes the underlying resource.
- close() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
- close() - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
- close() - Method in class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
- close() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServer
- close() - Method in class org.apache.beam.sdk.expansion.service.ExpansionService
- close() - Method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion.RecordBatchRowIterator
- close() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingReadableByteChannel
- close() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- close() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingWritableByteChannel
- close() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- close() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- close() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
- close() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
- close() - Method in interface org.apache.beam.sdk.fn.data.CloseableFnDataReceiver
- close() - Method in interface org.apache.beam.sdk.fn.server.FnService
- close() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
- close() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
- close() - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider
- close() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Closes the channel and returns the bundle result.
- close() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
Closes any ReadableByteChannel created for the current reader.
- close() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
-
Close the client object.
- close() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
- close() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- close() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
- close() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Gracefully close the underlying netty channel.
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.BlockingCommitterImpl
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactoryImpl
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- close() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Closes the current change stream ResultSet.
- close() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- close() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- close() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
- close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
- close() - Method in class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- close() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageProducer
-
Closes the message producer.
- close() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageReceiver
-
Closes the message receiver.
- close() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
Gracefully closes the connection to the service.
- close() - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageProducer
- close() - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageReceiver
- close() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Closes the reader.
- close() - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.ThriftWriter
- close() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- close() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.DefaultErrorHandler
- close() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler
- close(T) - Method in interface org.apache.beam.runners.portability.CloseableResource.Closer
- CloseableFnDataReceiver<T> - Interface in org.apache.beam.sdk.fn.data
-
A receiver of streamed data that can be closed.
- CloseableResource<T> - Class in org.apache.beam.runners.portability
-
An AutoCloseable that wraps a resource that needs to be cleaned up but does not implement AutoCloseable itself.
- CloseableResource.CloseException - Exception Class in org.apache.beam.runners.portability
-
An exception that wraps errors thrown while a resource is being closed.
- CloseableResource.Closer<T> - Interface in org.apache.beam.runners.portability
-
A function that knows how to clean up after a resource.
- CloseableThrowingConsumer<ExceptionT, T> - Interface in org.apache.beam.sdk.function
-
A ThrowingConsumer that can be closed.
- CLOSESTREAM_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Counter for the total number of CloseStreams identified during the execution of the Connector.
- closeTo(double, double) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.closeTo(double, double).
- CloudObject - Class in org.apache.beam.runners.dataflow.util
-
A representation of an arbitrary Java object to be instantiated by Dataflow workers.
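The SerializableMatchers.closeTo(double, double) entry above mirrors Hamcrest's matcher while remaining serializable, so it can cross worker boundaries in assertions. A small sketch of its matching criteria (the sample values are illustrative only):

```java
import org.apache.beam.sdk.testing.SerializableMatcher;
import org.apache.beam.sdk.testing.SerializableMatchers;

public class CloseToSketch {
    public static void main(String[] args) {
        // Matches any Double within +/- 0.01 of 1.0, like Matchers.closeTo.
        SerializableMatcher<Double> matcher = SerializableMatchers.closeTo(1.0, 0.01);
        System.out.println(matcher.matches(1.005)); // inside the error window
        System.out.println(matcher.matches(1.5));   // outside it
    }
}
```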
- cloudObjectClassName() - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
-
Gets the class name that will represent the CloudObject created by this CloudObjectTranslator.
- cloudObjectClassName() - Method in class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
- cloudObjectClassName() - Method in class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
- CloudObjects - Class in org.apache.beam.runners.dataflow.util
-
Utilities for converting an object to a CloudObject.
- CloudObjectTranslator<T> - Interface in org.apache.beam.runners.dataflow.util
-
A translator that takes an object and creates a CloudObject which can be converted back to the original object.
- CloudPubsubTransforms - Class in org.apache.beam.sdk.io.gcp.pubsublite
-
A class providing transforms between Cloud Pub/Sub and Pub/Sub Lite message types.
- CloudResourceManagerOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
-
Properties needed when using Google CloudResourceManager with the Apache Beam SDK.
- CloudVision - Class in org.apache.beam.sdk.extensions.ml
-
Factory class for implementations of AnnotateImages.
- CloudVision() - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision
- CloudVision.AnnotateImagesFromBytes - Class in org.apache.beam.sdk.extensions.ml
-
Accepts ByteString (encoded image contents) with optional DoFn.SideInput with a Map of ImageContext to the image.
- CloudVision.AnnotateImagesFromBytesWithContext - Class in org.apache.beam.sdk.extensions.ml
- CloudVision.AnnotateImagesFromGcsUri - Class in org.apache.beam.sdk.extensions.ml
-
Accepts String (image URI on GCS) with optional DoFn.SideInput with a Map of ImageContext to the image.
- CloudVision.AnnotateImagesFromGcsUriWithContext - Class in org.apache.beam.sdk.extensions.ml
- CO_GBK_RESULT_SCHEMA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- CodahaleCsvSink - Class in org.apache.beam.runners.spark.structuredstreaming.metrics.sink
-
A
Sink for Spark's metric system reporting metrics (including Beam step metrics) to a CSV file. - CodahaleCsvSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleCsvSink
-
Constructor for Spark 3.2.x and later.
- CodahaleCsvSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleCsvSink
-
Constructor for Spark 3.1.x and earlier.
- CodahaleGraphiteSink - Class in org.apache.beam.runners.spark.structuredstreaming.metrics.sink
-
A
Sink for Spark's metric system reporting metrics (including Beam step metrics) to Graphite. - CodahaleGraphiteSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleGraphiteSink
-
Constructor for Spark 3.2.x and later.
- CodahaleGraphiteSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleGraphiteSink
-
Constructor for Spark 3.1.x and earlier.
- coder - Variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- coder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
- Coder<T> - Class in org.apache.beam.sdk.coders
-
A
Coder<T> defines how to encode and decode values of type T into byte streams. - Coder() - Constructor for class org.apache.beam.sdk.coders.Coder
- Coder() - Constructor for class org.apache.beam.sdk.io.range.OffsetRange.Coder
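The Coder<T> contract above (encode a value into a byte stream, decode it back) can be sketched in plain Java. This is an illustrative stand-in using only the JDK; ToyStringCoder is a hypothetical name, not a Beam class, and real Beam coders additionally handle nested contexts and determinism checks:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

// Toy analogue of a Beam Coder<String>: length-prefixed UTF-8 bytes.
public class ToyStringCoder {
    // Encode: write the byte length, then the UTF-8 payload.
    public static byte[] encode(String value) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bos);
        byte[] utf8 = value.getBytes(StandardCharsets.UTF_8);
        out.writeInt(utf8.length);
        out.write(utf8);
        return bos.toByteArray();
    }

    // Decode: read the length prefix, then exactly that many bytes.
    public static String decode(byte[] encoded) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(encoded));
        byte[] utf8 = new byte[in.readInt()];
        in.readFully(utf8);
        return new String(utf8, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        // The fundamental coder property: decode(encode(x)) equals x.
        assert decode(encode("hello")).equals("hello");
    }
}
```

The length prefix is what makes the encoding usable in a nested context, where the decoder must know where one element ends and the next begins.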
- Coder.Context - Class in org.apache.beam.sdk.coders
-
Deprecated. To implement a coder, do not use any
Coder.Context. Implement only those abstract methods which do not accept a Coder.Context and leave the default implementations for methods accepting a Coder.Context. - Coder.NonDeterministicException - Exception Class in org.apache.beam.sdk.coders
-
Exception thrown by
Coder.verifyDeterministic() if the encoding is not deterministic, including details of why the encoding is not deterministic. - CoderCloudObjectTranslatorRegistrar - Interface in org.apache.beam.runners.dataflow.util
-
Coder authors have the ability to automatically have their Coder registered with the Dataflow Runner by creating a ServiceLoader entry and a concrete implementation of this interface. - coderConsistentWithEquals(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<T> and values of type T, the values are equal if and only if the encoded bytes are equal. - coderConsistentWithEqualsInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<T>, Coder.Context, and values of type T, the values are equal if and only if the encoded bytes are equal, in any Coder.Context. - coderDecodeEncodeContentsEqual(Coder<CollectionT>, CollectionT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<Collection<T>>, and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in any Coder.Context. - coderDecodeEncodeContentsEqualInContext(Coder<CollectionT>, Coder.Context, CollectionT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<Collection<T>>, and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in the given Coder.Context. - coderDecodeEncodeContentsInSameOrder(Coder<IterableT>, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<Collection<T>>, and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in any Coder.Context. - coderDecodeEncodeContentsInSameOrderInContext(Coder<IterableT>, Coder.Context, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<Iterable<T>>, and value of type Iterable<T>, encoding followed by decoding yields an equal value of type Collection<T>, in the given Coder.Context. - coderDecodeEncodeEqual(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<T>, and value of type T, encoding followed by decoding yields an equal value of type T, in any Coder.Context. - coderDecodeEncodeEqualInContext(Coder<T>, Coder.Context, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<T>, Coder.Context, and value of type T, encoding followed by decoding yields an equal value of type T. - coderDecodeEncodeInContext(Coder<T>, Coder.Context, T, Matcher<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<T>, Coder.Context, and value of type T, encoding followed by decoding yields a value of type T and tests that the matcher succeeds on the values. - coderDecodesBase64(Coder<T>, String, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
- coderDecodesBase64(Coder<T>, List<String>, List<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
- coderDecodesBase64ContentsEqual(Coder<IterableT>, String, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
- coderDecodesBase64ContentsEqual(Coder<IterableT>, List<String>, List<IterableT>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
- coderDeterministic(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<T>, and values of type T, if the values are equal then the encoded bytes are equal, in any Coder.Context. - coderDeterministicInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<T>, Coder.Context, and values of type T, if the values are equal then the encoded bytes are equal. - coderEncodesBase64(Coder<T>, List<T>, List<String>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
- coderEncodesBase64(Coder<T>, T, String) - Static method in class org.apache.beam.sdk.testing.CoderProperties
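The CoderProperties entries above verify coder laws such as determinism ("equal values encode to equal bytes"). The shape of that check can be sketched with a trivial UTF-8 encoding as a stand-in for a real Beam coder (DeterminismCheck is a hypothetical name, not part of Beam):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

// Sketch of the property that CoderProperties.coderDeterministic verifies:
// if two values are equal, their encodings must be byte-for-byte identical.
public class DeterminismCheck {
    static byte[] encode(String v) {
        return v.getBytes(StandardCharsets.UTF_8);
    }

    // Returns true when the determinism property holds for this pair.
    static boolean deterministicFor(String a, String b) {
        if (!a.equals(b)) {
            return true; // the property only constrains equal values
        }
        return Arrays.equals(encode(a), encode(b));
    }

    public static void main(String[] args) {
        assert deterministicFor("beam", "beam");
        assert deterministicFor("beam", "flink");
    }
}
```

Determinism matters because GroupByKey-style operations may compare keys by their encoded bytes rather than by calling equals().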
- CoderException - Exception Class in org.apache.beam.sdk.coders
-
An
Exception thrown if there is a problem encoding or decoding a value. - CoderException(String) - Constructor for exception class org.apache.beam.sdk.coders.CoderException
- CoderException(String, Throwable) - Constructor for exception class org.apache.beam.sdk.coders.CoderException
- CoderException(Throwable) - Constructor for exception class org.apache.beam.sdk.coders.CoderException
- coderFor(TypeDescriptor<T>, List<? extends Coder<?>>) - Method in class org.apache.beam.sdk.coders.CoderProvider
-
Returns a
Coder<T> to use for values of a particular type, given the Coders for each of the type's generic parameter types. - coderFor(TypeDescriptor<T>, List<? extends Coder<?>>) - Method in class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar.DefaultCoderProvider
-
Returns the
Coder returned according to the CoderProvider from any DefaultCoder annotation on the given class. - coderForFieldType(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
- coderFromCloudObject(CloudObject) - Static method in class org.apache.beam.runners.dataflow.util.CloudObjects
- CoderHelpers - Class in org.apache.beam.runners.spark.coders
-
Serialization utility class.
- CoderHelpers - Class in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
-
Serialization utility class.
- CoderHelpers.FromByteFunction<K, V> - Class in org.apache.beam.runners.spark.coders
-
A function for converting a byte array pair to a key-value pair.
- CoderProperties - Class in org.apache.beam.sdk.testing
-
Properties for use in
Coder tests. - CoderProperties() - Constructor for class org.apache.beam.sdk.testing.CoderProperties
- CoderProperties.TestElementByteSizeObserver - Class in org.apache.beam.sdk.testing
-
An
ElementByteSizeObserver that records the observed element sizes for testing purposes. - CoderProvider - Class in org.apache.beam.sdk.coders
-
A
CoderProvider provides Coders. - CoderProvider() - Constructor for class org.apache.beam.sdk.coders.CoderProvider
- CoderProviderRegistrar - Interface in org.apache.beam.sdk.coders
-
Coder creators have the ability to automatically have their coders registered with this SDK by creating a ServiceLoader entry and a concrete implementation of this interface. - CoderProviders - Class in org.apache.beam.sdk.coders
-
Static utility methods for creating and working with
CoderProviders. - CoderRegistry - Class in org.apache.beam.sdk.coders
- coderSerializable(Coder<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that the given
Coder<T> can be correctly serialized and deserialized. - CoderSizeEstimator<T> - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
-
This class is used to estimate the size in bytes of a given element.
- CoderSizeEstimator(Coder<T>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.CoderSizeEstimator
- CoderTypeInformation<T> - Class in org.apache.beam.runners.flink.translation.types
-
Flink
TypeInformation for Beam Coders. - CoderTypeInformation(Coder<T>, SerializablePipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- CoderTypeInformation(Coder<T>, PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- CoderTypeSerializer<T> - Class in org.apache.beam.runners.flink.translation.types
-
Flink
TypeSerializer for Beam Coders. - CoderTypeSerializer(Coder<T>, boolean) - Constructor for class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- CoderTypeSerializer(Coder<T>, SerializablePipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- CoGbkResult - Class in org.apache.beam.sdk.transforms.join
-
A row result of a
CoGroupByKey. - CoGbkResult(CoGbkResultSchema, Iterable<RawUnionValue>) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResult
-
A row in the
PCollection resulting from a CoGroupByKey transform. - CoGbkResult(CoGbkResultSchema, Iterable<RawUnionValue>, int, int) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResult
- CoGbkResult.CoGbkResultCoder - Class in org.apache.beam.sdk.transforms.join
-
A
Coder for CoGbkResults. - CoGbkResultSchema - Class in org.apache.beam.sdk.transforms.join
-
A schema for the results of a
CoGroupByKey. - CoGbkResultSchema(TupleTagList) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
Builds a schema from a tuple of
TupleTag<?>s. - CoGroup - Class in org.apache.beam.sdk.schemas.transforms
-
A transform that performs equijoins across multiple schema
PCollections. - CoGroup() - Constructor for class org.apache.beam.sdk.schemas.transforms.CoGroup
- CoGroup.By - Class in org.apache.beam.sdk.schemas.transforms
-
Defines the set of fields to extract for the join key, as well as other per-input join options.
- CoGroup.ExpandCrossProduct - Class in org.apache.beam.sdk.schemas.transforms
-
A
PTransform that calculates the cross-product join. - CoGroup.Impl - Class in org.apache.beam.sdk.schemas.transforms
-
The implementing PTransform.
- CoGroup.Result - Class in org.apache.beam.sdk.schemas.transforms
- CoGroupByKey<K> - Class in org.apache.beam.sdk.transforms.join
-
A
PTransform that performs a CoGroupByKey on a tuple of tables. - collect(String, Dataset<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext
-
The purpose of this utility is to mark the evaluation of Spark actions, both during Pipeline translation, when evaluation is required, and when finally evaluating the pipeline.
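The CoGroupByKey family of entries above all revolve around one idea: for each key, gather the values from every tagged input collection into one grouped row. That semantics can be sketched with plain JDK maps (CoGroupSketch and its tag names are hypothetical, not the Beam CoGbkResult API):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Toy model of CoGroupByKey: a CoGbkResult-like row is represented here
// as a map from input tag ("left"/"right") to the values for that key.
public class CoGroupSketch {
    static Map<String, Map<String, List<Integer>>> coGroup(
            List<Map.Entry<String, Integer>> left,
            List<Map.Entry<String, Integer>> right) {
        Map<String, Map<String, List<Integer>>> out = new TreeMap<>();
        for (Map.Entry<String, Integer> e : left) {
            out.computeIfAbsent(e.getKey(), k -> new HashMap<>())
               .computeIfAbsent("left", t -> new ArrayList<>()).add(e.getValue());
        }
        for (Map.Entry<String, Integer> e : right) {
            out.computeIfAbsent(e.getKey(), k -> new HashMap<>())
               .computeIfAbsent("right", t -> new ArrayList<>()).add(e.getValue());
        }
        return out;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> l = List.of(Map.entry("a", 1), Map.entry("a", 2));
        List<Map.Entry<String, Integer>> r = List.of(Map.entry("a", 10), Map.entry("b", 20));
        Map<String, Map<String, List<Integer>>> grouped = coGroup(l, r);
        // Key "a" has values from both inputs; key "b" only from the right.
        assert grouped.get("a").get("left").equals(List.of(1, 2));
        assert grouped.get("b").get("right").equals(List.of(20));
    }
}
```

Keys that appear in only one input still produce a row, which is what makes CoGroupByKey a building block for outer joins.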
- COLLECTION_TYPES - Static variable in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- CollectionCoder<T> - Class in org.apache.beam.sdk.coders
- CollectionCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.CollectionCoder
- collectionEncoder(Encoder<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
- collectionEncoder(Encoder<T>, boolean) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
- column(SqlParserPos, SqlIdentifier, SqlDataTypeSpec, SqlNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDdlNodes
-
Creates a column declaration.
- Column() - Constructor for class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
- Column() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.Column
- COLUMN_CREATED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp at which the partition row was first created.
- COLUMN_END_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp to end the change stream query of the partition.
- COLUMN_FAMILIES - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- COLUMN_FINISHED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp at which the partition was marked as finished by the
ReadChangeStreamPartitionDoFn SDF. - COLUMN_HEARTBEAT_MILLIS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the change stream query heartbeat interval in millis.
- COLUMN_PARENT_TOKENS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for parent partition tokens.
- COLUMN_PARTITION_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the partition token.
- COLUMN_RUNNING_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp at which the partition was marked as running by the
ReadChangeStreamPartitionDoFn SDF. - COLUMN_SCHEDULED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp at which the partition was scheduled by the
DetectNewPartitionsDoFn SDF. - COLUMN_START_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp to start the change stream query of the partition.
- COLUMN_STATE - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the state that the partition is currently in.
- COLUMN_WATERMARK - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the current watermark of the partition.
- columns() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema
- COLUMNS - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
- COLUMNS_MAPPING - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- columnType() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
- ColumnType - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Defines a column type from a Cloud Spanner table with the following information: column name, column type, a flag indicating whether the column is a primary key, and the column's position in the table.
- ColumnType() - Constructor for class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- ColumnType(String, TypeCode, boolean, long) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
- combine(Iterable<? extends Instant>) - Method in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Combines the given times, which must be from the same window and must have been passed through
TimestampCombiner.merge(org.apache.beam.sdk.transforms.windowing.BoundedWindow, java.lang.Iterable<? extends org.joda.time.Instant>). - combine(Iterable<WindowedValue<KV<K, InputT>>>, Collector<WindowedValue<KV<K, AccumT>>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkPartialReduceFunction
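The TimestampCombiner.combine entry above reduces the timestamps of a window's elements to a single output timestamp. A LATEST-style combiner, for example, keeps the maximum instant; a JDK-only sketch (LatestTimestampCombiner is a hypothetical name; Beam's enum uses Joda-Time Instants):

```java
import java.time.Instant;
import java.util.List;

// Sketch of LATEST-style timestamp combining: the combined timestamp of a
// window is the maximum of its elements' timestamps.
public class LatestTimestampCombiner {
    static Instant combine(List<Instant> times) {
        Instant result = Instant.MIN;
        for (Instant t : times) {
            if (t.isAfter(result)) {
                result = t;
            }
        }
        return result;
    }

    public static void main(String[] args) {
        Instant a = Instant.parse("2024-01-01T00:00:00Z");
        Instant b = Instant.parse("2024-06-01T00:00:00Z");
        // The later of the two instants wins.
        assert combine(List.of(a, b)).equals(b);
    }
}
```

An EARLIEST-style combiner would keep the minimum instead, and END_OF_WINDOW ignores the element timestamps entirely in favor of the window boundary.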
- combine(AbstractFlinkCombineRunner.FlinkCombiner<K, InputT, AccumT, OutputT>, WindowingStrategy<Object, W>, SideInputReader, PipelineOptions, Iterable<WindowedValue<KV<K, InputT>>>, Collector<WindowedValue<KV<K, OutputT>>>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner
-
Consumes
WindowedValues and produces combined output to the given output. - combine(AbstractFlinkCombineRunner.FlinkCombiner<K, InputT, AccumT, OutputT>, WindowingStrategy<Object, W>, SideInputReader, PipelineOptions, Iterable<WindowedValue<KV<K, InputT>>>, Collector<WindowedValue<KV<K, OutputT>>>) - Method in class org.apache.beam.runners.flink.translation.functions.HashingFlinkCombineRunner
- combine(AbstractFlinkCombineRunner.FlinkCombiner<K, InputT, AccumT, OutputT>, WindowingStrategy<Object, W>, SideInputReader, PipelineOptions, Iterable<WindowedValue<KV<K, InputT>>>, Collector<WindowedValue<KV<K, OutputT>>>) - Method in class org.apache.beam.runners.flink.translation.functions.SingleWindowFlinkCombineRunner
- combine(AbstractFlinkCombineRunner.FlinkCombiner<K, InputT, AccumT, OutputT>, WindowingStrategy<Object, W>, SideInputReader, PipelineOptions, Iterable<WindowedValue<KV<K, InputT>>>, Collector<WindowedValue<KV<K, OutputT>>>) - Method in class org.apache.beam.runners.flink.translation.functions.SortingFlinkCombineRunner
- combine(Instant...) - Method in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Varargs variant of
TimestampCombiner.combine(java.lang.Iterable<? extends org.joda.time.Instant>). - Combine - Class in org.apache.beam.sdk.transforms
-
PTransforms for combining PCollection elements globally and per-key. - Combine.AccumulatingCombineFn<InputT, AccumT, OutputT> - Class in org.apache.beam.sdk.transforms
-
A
CombineFn that uses a subclass of Combine.AccumulatingCombineFn.Accumulator as its accumulator type. - Combine.AccumulatingCombineFn.Accumulator<InputT, AccumT, OutputT> - Interface in org.apache.beam.sdk.transforms
-
The type of mutable accumulator values used by this
AccumulatingCombineFn. - Combine.BinaryCombineDoubleFn - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of
Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on doubles. - Combine.BinaryCombineFn<V> - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of
Combine.CombineFn for implementing combiners that are more easily expressed as binary operations. - Combine.BinaryCombineIntegerFn - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of
Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on ints. - Combine.BinaryCombineLongFn - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of
Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on longs. - Combine.CombineFn<InputT, AccumT, OutputT> - Class in org.apache.beam.sdk.transforms
-
A
CombineFn<InputT, AccumT, OutputT> specifies how to combine a collection of input values of type InputT into a single output value of type OutputT. - Combine.Globally<InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
Combine.Globally<InputT, OutputT> takes a PCollection<InputT> and returns a PCollection<OutputT> whose elements are the result of combining all the elements in each window of the input PCollection, using a specified CombineFn<InputT, AccumT, OutputT>. - Combine.GloballyAsSingletonView<InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
Combine.GloballyAsSingletonView<InputT, OutputT> takes a PCollection<InputT> and returns a PCollectionView<OutputT> whose elements are the result of combining all the elements in each window of the input PCollection, using a specified CombineFn<InputT, AccumT, OutputT>. - Combine.GroupedValues<K, InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
GroupedValues<K, InputT, OutputT> takes a PCollection<KV<K, Iterable<InputT>>>, such as the result of GroupByKey, applies a specified CombineFn<InputT, AccumT, OutputT> to each of the input KV<K, Iterable<InputT>> elements to produce a combined output KV<K, OutputT> element, and returns a PCollection<KV<K, OutputT>> containing all the combined output elements. - Combine.Holder<V> - Class in org.apache.beam.sdk.transforms
-
Holds a single value of type
V which may or may not be present. - Combine.IterableCombineFn<V> - Class in org.apache.beam.sdk.transforms
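The CombineFn<InputT, AccumT, OutputT> lifecycle described in the entries above (create an accumulator, add inputs, merge partial accumulators, extract the output) can be sketched with a toy mean combiner in plain Java. MeanCombineFn is a hypothetical stand-in, not Beam's Combine.CombineFn class:

```java
// Toy mean combiner following the CombineFn lifecycle:
// createAccumulator -> addInput* -> mergeAccumulators* -> extractOutput.
public class MeanCombineFn {
    // Accumulator: a running sum and count.
    static final class Accum {
        long sum;
        long count;
    }

    static Accum createAccumulator() {
        return new Accum();
    }

    static Accum addInput(Accum acc, long input) {
        acc.sum += input;
        acc.count += 1;
        return acc;
    }

    // Merging is what lets a runner combine partial results computed in
    // parallel on different workers.
    static Accum mergeAccumulators(Accum a, Accum b) {
        Accum merged = new Accum();
        merged.sum = a.sum + b.sum;
        merged.count = a.count + b.count;
        return merged;
    }

    static double extractOutput(Accum acc) {
        return acc.count == 0 ? Double.NaN : (double) acc.sum / acc.count;
    }

    public static void main(String[] args) {
        Accum a = addInput(addInput(createAccumulator(), 1), 3);
        Accum b = addInput(createAccumulator(), 5);
        // (1 + 3 + 5) / 3 = 3.0
        assert extractOutput(mergeAccumulators(a, b)) == 3.0;
    }
}
```

Note that a plain mean cannot be expressed as a binary operation on the outputs alone, which is exactly why the accumulator type AccumT is separate from OutputT.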
- Combine.PerKey<K, InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
PerKey<K, InputT, OutputT> takes a PCollection<KV<K, InputT>>, groups it by key, applies a combining function to the InputT values associated with each key to produce a combined OutputT value, and returns a PCollection<KV<K, OutputT>> representing a map from each distinct key of the input PCollection to the corresponding combined value. - Combine.PerKeyWithHotKeyFanout<K, InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
Like
Combine.PerKey, but sharding the combining of hot keys. - Combine.SimpleCombineFn<V> - Class in org.apache.beam.sdk.transforms
-
Deprecated.
- CombineAsIterable<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
- CombineAsIterable() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CombineAsIterable
- CombineFieldsByFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- combineFn - Variable in class org.apache.beam.runners.flink.translation.functions.FlinkPartialReduceFunction
- combineFn - Variable in class org.apache.beam.runners.flink.translation.functions.FlinkReduceFunction
- combineFn() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf
- combineFn() - Static method in class org.apache.beam.sdk.transforms.Count
-
Returns a
Combine.CombineFn that counts the number of its inputs. - combineFn() - Static method in class org.apache.beam.sdk.transforms.Latest
-
Returns a
Combine.CombineFn that selects the latest element among its inputs. - combineFn(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a
Combine.CombineFn that computes a fixed-sized uniform sample of its inputs. - CombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.CombineFn
- CombineFnBase - Class in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- CombineFnBase() - Constructor for class org.apache.beam.sdk.transforms.CombineFnBase
- CombineFnBase.GlobalCombineFn<InputT, AccumT, OutputT> - Interface in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- CombineFns - Class in org.apache.beam.sdk.transforms
-
Static utility methods that create combine function instances.
- CombineFns() - Constructor for class org.apache.beam.sdk.transforms.CombineFns
- CombineFns.CoCombineResult - Class in org.apache.beam.sdk.transforms
-
A tuple of outputs produced by composed combine functions.
- CombineFns.ComposeCombineFnBuilder - Class in org.apache.beam.sdk.transforms
-
A builder class to construct a composed
CombineFnBase.GlobalCombineFn. - CombineFns.ComposedCombineFn<DataT> - Class in org.apache.beam.sdk.transforms
-
A composed
Combine.CombineFn that applies multiple CombineFns. - CombineFns.ComposedCombineFnWithContext<DataT> - Class in org.apache.beam.sdk.transforms
-
A composed
CombineWithContext.CombineFnWithContext that applies multiple CombineFnWithContexts. - CombineFnTester - Class in org.apache.beam.sdk.testing
-
Utilities for testing
CombineFns. - CombineFnTester() - Constructor for class org.apache.beam.sdk.testing.CombineFnTester
- CombineFnWithContext() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
- combineGlobally(JavaRDD<WindowedValue<InputT>>, SparkCombineFn<InputT, InputT, AccumT, OutputT>, Coder<AccumT>, WindowingStrategy<?, ?>) - Static method in class org.apache.beam.runners.spark.translation.GroupCombineFunctions
-
Apply a composite
Combine.Globally transformation. - CombineGroupedValues(SparkCombineFn<KV<K, InputT>, InputT, ?, OutputT>) - Constructor for class org.apache.beam.runners.spark.translation.TranslationUtils.CombineGroupedValues
- combinePerKey(JavaRDD<WindowedValue<KV<K, V>>>, SparkCombineFn<KV<K, V>, V, AccumT, ?>, Coder<K>, Coder<V>, Coder<AccumT>, WindowingStrategy<?, ?>) - Static method in class org.apache.beam.runners.spark.translation.GroupCombineFunctions
-
Apply a composite
Combine.PerKey transformation. - CombineWithContext - Class in org.apache.beam.sdk.transforms
-
This class contains combine functions that have access to
PipelineOptions and side inputs through CombineWithContext.Context. - CombineWithContext() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext
- CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT> - Class in org.apache.beam.sdk.transforms
-
A combine function that has access to
PipelineOptions and side inputs through CombineWithContext.Context. - CombineWithContext.Context - Class in org.apache.beam.sdk.transforms
-
Information accessible to all methods in
CombineFnWithContext and KeyedCombineFnWithContext. - CombineWithContext.RequiresContextInternal - Interface in org.apache.beam.sdk.transforms
-
An internal interface for signaling that a
GloballyCombineFn or a PerKeyCombineFn needs to access CombineWithContext.Context. - combining(Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Identical to
StateSpecs.combining(CombineFn), but with an accumulator coder explicitly supplied. - combining(Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards compatibility guarantees
- combining(Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Create a
StateSpec for a CombiningState which uses a Combine.CombineFn to automatically merge multiple values of type InputT into a single resulting OutputT. - combining(CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards compatibility guarantees
- combiningFromInputInternal(Coder<InputT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- CombiningState<InputT, AccumT, OutputT> - Interface in org.apache.beam.sdk.state
-
A
ReadableState cell defined by a Combine.CombineFn, accepting multiple input values, combining them as specified into accumulators, and producing a single output value. - comment(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
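The CombiningState idea above (a state cell that accepts many inputs, folds them through a combine function, and reads out one combined value) can be sketched with a small JDK-only class. ToyCombiningState is a hypothetical stand-in, not the Beam interface:

```java
import java.util.function.BinaryOperator;

// Sketch of CombiningState: inputs are folded into an accumulator as they
// arrive, so reading the cell never needs to replay the input history.
public class ToyCombiningState<T> {
    private T accumulator;
    private final BinaryOperator<T> combine;

    ToyCombiningState(T identity, BinaryOperator<T> combine) {
        this.accumulator = identity;
        this.combine = combine;
    }

    // Fold a new input into the accumulator.
    void add(T input) {
        accumulator = combine.apply(accumulator, input);
    }

    // Read the combined result so far.
    T read() {
        return accumulator;
    }

    public static void main(String[] args) {
        ToyCombiningState<Integer> sum = new ToyCombiningState<>(0, Integer::sum);
        sum.add(4);
        sum.add(6);
        assert sum.read() == 10;
    }
}
```

Storing only the accumulator, rather than every input, is what keeps such state cells compact for hot keys.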
- commit() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- commitOffset(Offset) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.BlockingCommitterImpl
- commitOffsets() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Enables committing record offsets.
- commitOffsetsInFinalize() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Finalized offsets are committed to Kafka.
- commitWriteStreams(String, Iterable<String>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
-
Commit write streams of type PENDING.
- commitWriteStreams(String, Iterable<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- commitWriteStreams(String, Iterable<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- Common Kafka Consumer Configurations - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- commonPrefixLength(RandomAccessData, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
-
Compute the length of the common prefix of the two provided sets of bytes.
- compact(AccumT) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- compact(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns an accumulator that represents the same logical value as the input accumulator, but may have a more compact representation.
- compact(AccumT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns an accumulator that represents the same logical value as the input accumulator, but may have a more compact representation.
- compact(Object[]) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- compact(Object[], CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- compact(List<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- compare(byte[], byte[]) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- compare(JobMessage, JobMessage) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil.TimeStampComparator
- compare(RandomAccessData, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
- compare(RandomAccessData, RandomAccessData, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
-
Compare the two sets of bytes starting at the given offset.
- compare(KV<K, V>, KV<K, V>) - Method in class org.apache.beam.sdk.values.KV.OrderByKey
- compare(KV<K, V>, KV<K, V>) - Method in class org.apache.beam.sdk.values.KV.OrderByValue
- compare(Row, Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel.BeamSqlRowComparator
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Largest
-
Deprecated.
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Natural
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Reversed
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Smallest
-
Deprecated.
- compareSchemaField(Schema.Field, Schema.Field) - Static method in class org.apache.beam.sdk.io.jdbc.SchemaUtil
-
Compares two fields.
- compareSerialized(DataInputView, DataInputView) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- compareTo(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- compareTo(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- compareTo(ByteArray) - Method in class org.apache.beam.runners.spark.util.ByteArray
- compareTo(ContiguousSequenceRange) - Method in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
- compareTo(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKey
-
ByteKey implements Comparable<ByteKey> by comparing the arrays in lexicographic order.
- compareTo(RedisCursor) - Method in class org.apache.beam.sdk.io.redis.RedisCursor
-
RedisCursor implements Comparable<RedisCursor> by transforming the cursors to an index of the Redis table.
- compareTo(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
- compareToReference(TypeComparator<byte[]>) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- comparing(SerializableFunction<? super T, ? extends V>) - Static method in interface org.apache.beam.sdk.transforms.SerializableComparator
-
Analogous to Comparator.comparing(Function), except that it takes in a SerializableFunction as the key extractor and returns a SerializableComparator.
- comparingNullFirst(Function<? super T, ? extends K>) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
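The `SerializableComparator.comparing` entry above combines `Comparator.comparing(Function)` with serializability so the comparator can be shipped to workers. A minimal self-contained sketch of the idea; these interface declarations shadow Beam's names and are illustrative, not the SDK source:

```java
import java.io.Serializable;
import java.util.Comparator;

// Stand-in for Beam's SerializableFunction: a Function that is also Serializable.
interface SerializableFunction<T, R> extends Serializable {
  R apply(T input);
}

// Stand-in for Beam's SerializableComparator: a Comparator that is also Serializable.
interface SerializableComparator<T> extends Comparator<T>, Serializable {
  static <T, V extends Comparable<? super V>> SerializableComparator<T> comparing(
      SerializableFunction<? super T, ? extends V> keyExtractor) {
    // The lambda captures only the serializable extractor, so it serializes cleanly.
    return (a, b) -> keyExtractor.apply(a).compareTo(keyExtractor.apply(b));
  }
}
```

Usage mirrors `Comparator.comparing`: `SerializableComparator.comparing(s -> s.length())` orders strings by length and remains serializable.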
- CompatibilityError() - Constructor for class org.apache.beam.sdk.schemas.transforms.Cast.CompatibilityError
- compile(List<CEPPattern>, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.nfa.NFA
- CompileException(DiagnosticCollector<?>) - Constructor for exception class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler.CompileException
- complete() - Method in class org.apache.beam.runners.jet.processors.ParDoP
- complete() - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
- complete() - Method in class org.apache.beam.runners.jet.processors.ImpulseP
- complete() - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- complete() - Method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
- complete() - Method in class org.apache.beam.runners.jet.processors.ViewP
- complete() - Method in class org.apache.beam.runners.jet.processors.WindowGroupP
- complete(List<TimestampedValue<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
-
Constructs a Watch.Growth.PollResult with the given outputs and declares that there will be no new outputs for the current input.
- complete(Instant, List<OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
-
Like Watch.Growth.PollResult.complete(List), but assigns the same timestamp to all new outputs.
- completed() - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
-
Report that the pipeline has successfully completed.
- completeEdge(int) - Method in class org.apache.beam.runners.jet.processors.ParDoP
- completeEdge(int) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- CompleteFlinkCombiner(CombineFnBase.GlobalCombineFn<InputT, AccumT, OutputT>) - Constructor for class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.CompleteFlinkCombiner
- complexityFactor - Variable in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator
- COMPONENT_ENCODINGS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- compose() - Static method in class org.apache.beam.sdk.transforms.CombineFns
-
Returns a CombineFns.ComposeCombineFnBuilder to construct a composed CombineFnBase.GlobalCombineFn.
- compose(String, SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.PTransform
-
Like PTransform.compose(SerializableFunction), but with a custom name.
- compose(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.PTransform
-
For a SerializableFunction<InputT, OutputT> fn, returns a PTransform given by applying fn.apply(v) to the input PCollection<InputT>.
- ComposeCombineFnBuilder() - Constructor for class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
- COMPOSITE_TYPES - Static variable in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- CompressedReader(CompressedSource<T>, FileBasedSource.FileBasedReader<T>) - Constructor for class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
Create a CompressedReader from a CompressedSource and delegate reader.
- CompressedSource<T> - Class in org.apache.beam.sdk.io
-
A Source that reads from compressed files.
- CompressedSource.CompressedReader<T> - Class in org.apache.beam.sdk.io
-
Reader for a CompressedSource.
- CompressedSource.CompressionMode - Enum Class in org.apache.beam.sdk.io
-
Deprecated. Use Compression instead.
- CompressedSource.DecompressingChannelFactory - Interface in org.apache.beam.sdk.io
-
Factory interface for creating channels that decompress the content of an underlying channel.
- Compression - Enum Class in org.apache.beam.sdk.io
-
Various compression types for reading/writing files.
- compute(Iterator<RawUnionValue>, RecordCollector<WindowedValue<OutputT>>) - Method in class org.apache.beam.runners.twister2.translators.functions.OutputTagFilter
- compute(Iterator<WindowedValue<InputT>>, RecordCollector<RawUnionValue>) - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
- compute(Iterator<WindowedValue<T>>, RecordCollector<WindowedValue<T>>) - Method in class org.apache.beam.runners.twister2.translators.functions.AssignWindowsFunction
- compute(Partition, TaskContext) - Method in class org.apache.beam.runners.spark.io.SourceRDD.Bounded
- compute(Partition, TaskContext) - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
- compute(Time) - Method in class org.apache.beam.runners.spark.translation.SingleEmitInputDStream
- compute(Time) - Method in class org.apache.beam.runners.spark.translation.streaming.TestDStream
- computeIfAbsent(K, Function<? super K, ? extends V>) - Method in interface org.apache.beam.sdk.state.MapState
-
A deferred read-followed-by-write.
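The MapState.computeIfAbsent entry above mirrors the semantics of java.util.Map.computeIfAbsent (compute and store a value only when the key is absent), with the difference that Beam defers the read behind a ReadableState. A small pure-Java sketch of the non-deferred semantics; the word-count helper is illustrative, not Beam API:

```java
import java.util.Map;

// Sketch: computeIfAbsent as a read-followed-by-write. The first access for a
// key installs a default (0); subsequent accesses read the stored count.
class ComputeIfAbsentDemo {
  static int countWord(Map<String, Integer> counts, String word) {
    int current = counts.computeIfAbsent(word, w -> 0); // insert 0 only if absent
    counts.put(word, current + 1);                      // the "write" half
    return current + 1;
  }
}
```

In Beam the same pattern avoids a separate get-then-put round trip against the state backend.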
- computeOutputs() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Computes the outputs for all RDDs that are leaves in the DAG and do not have any actions (like saving to a file) registered on them (i.e.
- computeOutputs() - Method in class org.apache.beam.runners.spark.translation.SparkTranslationContext
-
Compute the outputs for all RDDs that are leaves in the DAG.
- computeSelfCost(RelOptPlanner, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
- computeSelfCost(RelOptPlanner, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- concat(Iterable<T>...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables
-
Concatenates the Iterables.
- concat(Iterator<T>...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
-
Concatenates the Iterators.
- concat(List<T>, List<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
- CONCAT_FIELD_NAMES - Static variable in class org.apache.beam.sdk.schemas.utils.SelectHelpers
-
This policy keeps all levels of a name.
- Concatenate() - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- ConcatenateAsIterable() - Constructor for class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- concatFieldNames() - Method in class org.apache.beam.sdk.schemas.transforms.Select.Flattened
-
For nested fields, concatenate all the names separated by a _ character in the flattened schema.
- concatIterators(Iterator<Iterator<T>>) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
- CONCRETE_CLASS - Static variable in class org.apache.beam.sdk.io.WriteFiles
-
For internal use by runners.
- config() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- config() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
- Config() - Constructor for class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config
- configuration - Variable in class org.apache.beam.runners.jobsubmission.JobServerDriver
- Configuration - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- Section
- Configuration() - Constructor for class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ChangeStreamReaderBuilder.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergCdcReadSchemaTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergReadSchemaTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.ReadBuilder.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
- configurationClass() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
-
Provides the required TypedSchemaTransformProvider.configurationClass().
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
-
Returns the expected class of the configuration.
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadProvider
-
Returns the expected class of the configuration.
- configurationClass() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
-
Returns the expected class of the configuration.
- configurationClass() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
- ConfigurationLocator() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions.ConfigurationLocator
- Configuration of AWS clients - Search tag in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO
- Section
- Configuration of AWS clients - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- Configuration of AWS clients - Search tag in class org.apache.beam.sdk.io.aws2.sns.SnsIO
- Section
- Configuration of AWS clients - Search tag in class org.apache.beam.sdk.io.aws2.sqs.SqsIO
- Section
- Configuration Options - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- configurationSchema() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
- configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
- Configurations of ReadSourceDescriptors - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- configure() - Static method in class org.apache.beam.sdk.transforms.windowing.Window
-
Returns a new builder for a Window transform for setting windowing parameters other than the windowing function.
- configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
- configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
- configure(Configuration) - Method in class org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat
- configure(Configuration) - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat
- ConfigWrapper<T> - Class in org.apache.beam.sdk.io.cdap
-
Class for building a PluginConfig object of the specific class.
- ConfigWrapper(Class<T>) - Constructor for class org.apache.beam.sdk.io.cdap.ConfigWrapper
- ConfluentSchemaRegistryDeserializerProvider<T> - Class in org.apache.beam.sdk.io.kafka
-
A DeserializerProvider that uses Confluent Schema Registry to resolve a Deserializer and Coder given a subject.
- connect() - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
Connect to the Redis instance.
- connect() - Method in class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- connect() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
Establishes a connection to the service.
- connect(String, Properties) - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
-
Configures Beam-specific options and opens a JDBC connection to Calcite.
- connect(CatalogManager, PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
-
Like JdbcDriver.connect(TableProvider, PipelineOptions), but overrides the top-level schema with a CatalogManager.
- connect(TableProvider, PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
-
Connects to the driver using the standard JdbcDriver.connect(String, Properties) call, but overrides the initial schema factory.
- CONNECT_STRING_PREFIX - Static variable in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
- connection() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- connection() - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogManagerSchema
- connectionAcquisitionTimeout() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Milliseconds to wait when acquiring a connection from the pool before giving up and timing out.
- connectionAcquisitionTimeout(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Milliseconds to wait when acquiring a connection from the pool before giving up and timing out.
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ConnectionConfiguration
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
- ConnectionManager - Class in org.apache.beam.sdk.io.cassandra
- ConnectionManager() - Constructor for class org.apache.beam.sdk.io.cassandra.ConnectionManager
- connectionMaxIdleTime() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Maximum milliseconds a connection should be allowed to remain open while idle.
- connectionMaxIdleTime(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Maximum milliseconds a connection should be allowed to remain open while idle.
- connectionTimeout() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Milliseconds to wait when initially establishing a connection before giving up and timing out.
- connectionTimeout(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Milliseconds to wait when initially establishing a connection before giving up and timing out.
- connectionTimeToLive() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Maximum milliseconds a connection should be allowed to remain open, regardless of usage frequency.
- connectionTimeToLive(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Maximum milliseconds a connection should be allowed to remain open, regardless of usage frequency.
- ConnectorConfiguration() - Constructor for class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
- Connector retries - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- Connectors - Enum Class in org.apache.beam.io.debezium
-
Enumeration of Debezium connectors.
- consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.FooterCoder
- consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShardCoder
- consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefixCoder
- consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- consistentWithEquals() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- consistentWithEquals() - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BitSetCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BooleanCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.ByteCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.Coder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DequeCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DoubleCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DurationCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.FloatCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.InstantCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.KvCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
LengthPrefixCoder is consistent with equals if the nested Coder is.
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.ListCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.MapCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.NullableCoder
-
NullableCoder is consistent with equals if the nested Coder is.
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.OptionalCoder
-
OptionalCoder is consistent with equals if the nested Coder is.
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.SerializableCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VarIntCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VarLongCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.ZstdCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.MetadataCoderV2
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
- consistentWithEquals() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
- consistentWithEquals() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
- ConsoleIO - Class in org.apache.beam.runners.spark.io
-
Print to console.
- ConsoleIO.Write - Class in org.apache.beam.runners.spark.io
-
Write to console.
- ConsoleIO.Write.Unbound<T> - Class in org.apache.beam.runners.spark.io
-
PTransform writing a PCollection to the console.
- constant(FileBasedSink.FilenamePolicy) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
-
A specialization of DynamicFileDestinations.constant(FilenamePolicy, SerializableFunction) for the case where UserT and OutputT are the same type and the format function is the identity.
- constant(FileBasedSink.FilenamePolicy, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
-
Returns a FileBasedSink.DynamicDestinations that always returns the same FileBasedSink.FilenamePolicy.
- constant(OutT) - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
- CONSTANT_WINDOW_SIZE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- constantDestinations(FileBasedSink.FilenamePolicy, Schema, Map<String, Object>, CodecFactory, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Returns a DynamicAvroDestinations that always returns the same FileBasedSink.FilenamePolicy, schema, metadata, and codec.
- constantDestinations(FileBasedSink.FilenamePolicy, Schema, Map<String, Object>, CodecFactory, SerializableFunction<UserT, OutputT>, AvroSink.DatumWriterFactory<OutputT>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Returns a DynamicAvroDestinations that always returns the same FileBasedSink.FilenamePolicy, schema, metadata, and codec.
- Constraints - Search tag in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
- Section
- Constraints - Search tag in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- Section
- Constraints - Search tag in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
- Section
- constructFilter(List<RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
- constructFilter(List<RexNode>) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Generate an IO implementation of BeamSqlTableFilter for predicate push-down.
- constructFilter(List<RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
- constructFilter(List<RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
- constructName(ResourceId, String, String, int, int, String, String) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
Constructs a fully qualified name from components.
- Consumed positions - Search tag in interface org.apache.beam.sdk.io.range.RangeTracker
- Section
- consumesProjection() - Method in interface org.apache.beam.sdk.schemas.ProjectionConsumer
-
Returns a map from input TupleTag id to a FieldAccessDescriptor describing which Schema fields this must access from the corresponding input PCollection to complete successfully.
- Consuming messages from RabbitMQ server - Search tag in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO
- Section
- contains(Descriptors.Descriptor) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- contains(List<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.contains(List).
- contains(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.contains(Object[]).
- contains(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.contains(Matcher[]).
- contains(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns whether this window contains the given window.
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkSideInputReader
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.CachedSideInputReader
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.util.CachedSideInputReader
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.twister2.utils.Twister2SideInputReader
- contains(T) - Method in interface org.apache.beam.sdk.state.SetState
-
Returns a ReadableState whose ReadableState.read() method will return true if this set contains the specified element at the point when that ReadableState.read() call returns.
- contains(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.contains(Object[]).
- containsInAnyOrder() - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Deprecated. Prefer PAssert.IterableAssert.empty() to this method.
- containsInAnyOrder() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
- containsInAnyOrder(Iterable<T>) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Asserts that the iterable in question contains the provided elements.
- containsInAnyOrder(Iterable<T>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
Checks that the Iterable contains the expected elements, in any order.
- containsInAnyOrder(Collection<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.containsInAnyOrder(Collection).
- containsInAnyOrder(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.containsInAnyOrder(Object[]).
- containsInAnyOrder(SerializableMatcher<? super T>...) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Asserts that the iterable in question matches the provided elements.
- containsInAnyOrder(SerializableMatcher<? super T>...) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
Checks that the Iterable contains elements that match the provided matchers, in any order.
- containsInAnyOrder(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.containsInAnyOrder(Matcher[]).
- containsInAnyOrder(T...) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Asserts that the iterable in question contains the provided elements.
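The PAssert.containsInAnyOrder family above is normally used inside a test pipeline. A minimal sketch (assuming the Beam SDK and a direct runner are on the classpath; the class name is illustrative):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;

public class ContainsInAnyOrderExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
    PCollection<String> words = p.apply(Create.of("a", "b", "c"));
    // Order-insensitive check on the full contents of the PCollection;
    // the assertion is verified when the pipeline runs.
    PAssert.that(words).containsInAnyOrder("c", "a", "b");
    p.run().waitUntilFinish();
  }
}
```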
- containsInAnyOrder(T...) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
Checks that the Iterable contains the expected elements, in any order.
- containsInAnyOrder(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.containsInAnyOrder(Object[]).
- containsKey(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- containsKey(K) - Method in interface org.apache.beam.sdk.state.MultimapState
-
Returns a ReadableState whose ReadableState.read() method will return true if this multimap contains the specified key at the point when that ReadableState.read() call returns.
- containsKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns true if the specified ByteKey is contained within this range.
- containsSeekableInput(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
-
Returns whether any of the children of the RelNode are Seekable.
- containsString(String) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.containsString(java.lang.String).
- containsValue(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- CONTENT_STRUCTURE_UNSPECIFIED - Enum constant in enum class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
-
If the content structure is not specified, the default value BUNDLE will be used.
- context - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- Context() - Constructor for class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
- Context() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext.Context
- Context() - Constructor for class org.apache.beam.sdk.transforms.Contextful.Fn.Context
- Context(boolean) - Constructor for class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated.
- Context(TableDataInsertAllResponse.InsertErrors) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy.Context
- Contextful<ClosureT> - Class in org.apache.beam.sdk.transforms
-
Pair of a bit of user code (a "closure") and the Requirements needed to run it.
- Contextful.Fn<InputT, OutputT> - Interface in org.apache.beam.sdk.transforms
A function from an input to an output that may additionally access Contextful.Fn.Context when computing the result.
- Contextful.Fn.Context - Class in org.apache.beam.sdk.transforms
-
An accessor for additional capabilities available in Contextful.Fn.apply(InputT, org.apache.beam.sdk.transforms.Contextful.Fn.Context).
- contextSideInput - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytes
- contextSideInput - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytesWithContext
- contextSideInput - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUri
- contextSideInput - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUriWithContext
- ContextualTextIO - Class in org.apache.beam.sdk.io.contextualtextio
-
PTransforms that read text files and collect contextual information of the elements in the input.
- ContextualTextIO.Read - Class in org.apache.beam.sdk.io.contextualtextio
-
Implementation of ContextualTextIO.read().
- ContextualTextIO.ReadFiles - Class in org.apache.beam.sdk.io.contextualtextio
-
Implementation of ContextualTextIO.readFiles().
- ContiguousSequenceRange - Class in org.apache.beam.sdk.extensions.ordered
-
A range of contiguous event sequences and the latest timestamp of the events in the range.
- ContiguousSequenceRange() - Constructor for class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
- CONTINUE - Enum constant in enum class org.apache.beam.runners.local.ExecutionDriver.DriverState
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.Match
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
Continuously watches for new files at the given interval until the given termination condition is reached, where the input to the condition is the filepattern.
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.FileIO.Match
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
Continuously watches for new files at the given interval until the given termination condition is reached, where the input to the condition is the filepattern.
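The continuous-watching behavior described above can be sketched with FileIO.match(). A hedged example (assuming the Beam SDK is on the classpath; the bucket path is a placeholder):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.FileIO;
import org.apache.beam.sdk.io.fs.MatchResult;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Watch;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

public class ContinuousMatchExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // Poll the filepattern every 30 seconds; stop watching once no new
    // files have appeared for an hour.
    PCollection<MatchResult.Metadata> matches =
        p.apply(
            FileIO.match()
                .filepattern("gs://my-bucket/input/*.csv")
                .continuously(
                    Duration.standardSeconds(30),
                    Watch.Growth.afterTimeSinceNewOutput(Duration.standardHours(1))));
    p.run();
  }
}
```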
- control(StreamObserver<BeamFnApi.InstructionRequest>) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
-
Called by gRPC for each incoming connection from an SDK harness, and enqueues an available SDK harness client.
- ControlClientPool - Interface in org.apache.beam.runners.fnexecution.control
-
A pool of control clients that brokers incoming SDK harness connections (in the form of InstructionRequestHandlers).
- ControlClientPool.Sink - Interface in org.apache.beam.runners.fnexecution.control
-
A sink for InstructionRequestHandlers keyed by worker id.
- ControlClientPool.Source - Interface in org.apache.beam.runners.fnexecution.control
-
A source of InstructionRequestHandlers.
- ConversionOptions() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
- convert() - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable.RowToDocument
- convert(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamEnumerableConverterRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIntersectRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOSinkRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMatchRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMinusRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputLookupJoinRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSortRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamTableFunctionScanRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUncollectRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnionRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamValuesRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamWindowRule
- Convert - Class in org.apache.beam.sdk.schemas.transforms
-
A set of utilities for converting between different objects supporting schemas.
- Convert() - Constructor for class org.apache.beam.sdk.schemas.transforms.Convert
- CONVERT_TO_BIG_DECIMAL - Enum constant in enum class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Behavior
-
Converts the unsigned value to a BigDecimal value.
- CONVERT_TO_STRING - Enum constant in enum class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Behavior
-
Converts the unsigned value to a string representation.
- convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertAvroFieldStrict(Object, Schema, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Strict conversion from AVRO to Beam, strict because it doesn't do widening or narrowing during conversion.
- convertAvroFieldStrict(Object, Schema, Schema.FieldType, GenericData) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Strict conversion from AVRO to Beam, strict because it doesn't do widening or narrowing during conversion.
- convertAvroFormat(Schema.FieldType, Object, BigQueryUtils.ConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Tries to convert an Avro decoded value to a Beam field value based on the target type of the Beam field.
- convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertType
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertValueForGetter
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertValueForSetter
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- ConvertedSchemaInformation(SchemaCoder<T>, Schema.FieldType) - Constructor for class org.apache.beam.sdk.schemas.utils.ConvertHelpers.ConvertedSchemaInformation
- convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertGenericRecordToTableRow(GenericRecord) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Converts a generic record to a BigQuery TableRow.
- convertGenericRecordToTableRow(GenericRecord, TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Deprecated.
- ConvertHelpers - Class in org.apache.beam.sdk.schemas.utils
-
Helper functions for converting between equivalent schema types.
- ConvertHelpers() - Constructor for class org.apache.beam.sdk.schemas.utils.ConvertHelpers
- ConvertHelpers.ConvertedSchemaInformation<T> - Class in org.apache.beam.sdk.schemas.utils
-
Return value after converting a schema.
- convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertNewPartitionRowKeyToPartition(ByteString) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Converts a New Partition row key to a partition, in order to process metadata read from Bigtable.
- convertNode2Map(JsonNode) - Static method in class org.apache.beam.sdk.extensions.sql.TableUtils
- convertNumbers(TableRow) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- convertPartitionToNewPartitionRowKey(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Converts a partition to a New Partition row key, used to query for partitions ready to be streamed as the result of splits and merges.
- convertPartitionToStreamPartitionRowKey(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Converts a partition to a Stream Partition row key, used to query for metadata of partitions that are currently being streamed.
- convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertRelOptCost(RelOptCost) - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- convertStreamPartitionRowKeyToPartition(ByteString) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Converts a Stream Partition row key to a partition, in order to process metadata read from Bigtable.
- convertToBagSpecInternal(StateSpec<CombiningState<InputT, AccumT, OutputT>>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- convertToBeamRel(String, QueryPlanner.QueryParameters) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
-
Parses and validates the input query, then converts it into a BeamRelNode tree.
- convertToBeamRel(String, QueryPlanner.QueryParameters) - Method in interface org.apache.beam.sdk.extensions.sql.impl.QueryPlanner
-
Parses and validates the input query, then converts it into a BeamRelNode tree.
- convertToFileResourceIfPossible(String) - Static method in class org.apache.beam.sdk.io.FileBasedSink
-
A helper function for converting a user-provided output filename prefix into a ResourceId for writing output files.
- convertToJcsmpDestination(Solace.Destination) - Static method in class org.apache.beam.sdk.io.solace.SolaceIO
-
Convert to a JCSMP destination from a schema-enabled Solace.Destination.
- convertToMapSpecInternal(StateSpec<SetState<KeyT>>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- convertToMultimapSpecInternal(StateSpec<MapState<KeyT, ValueT>>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- ConvertType(boolean) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- ConvertValueForGetter(StackManipulation) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- ConvertValueForSetter(StackManipulation) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- copy() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns a copy of this RandomAccessData.
- copy() - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
- copy() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
- copy(byte[]) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- copy(byte[], byte[]) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- copy(Iterable<String>, Iterable<String>) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
- copy(List<ClassLoaderFileSystem.ClassLoaderResourceId>, List<ClassLoaderFileSystem.ClassLoaderResourceId>) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
- copy(List<ResourceId>, List<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Copies a List of file-like resources from one location to another.
- copy(List<RexLiteral>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
- copy(List<ResourceIdT>, List<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
-
Copies a List of file-like resources from one location to another.
- copy(StateNamespace) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- copy(StateNamespace, StateNamespace) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
- copy(RelTraitSet, List<RelNode>, RexNode, Type, RelDataType, Set<RelColumnMapping>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
- copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
- copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
- copy(RelTraitSet, RelNode, List<RexLiteral>, RelDataType, List<Window.Group>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
- copy(RelTraitSet, RelNode, RelCollation, RexNode, RexNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- copy(RelTraitSet, RelNode, RexProgram) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
- copy(RelTraitSet, RelNode, ImmutableBitSet, List<ImmutableBitSet>, List<AggregateCall>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
- copy(RelTraitSet, RexNode, RelNode, RelNode, JoinRelType, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCoGBKJoinRel
- copy(RelTraitSet, RexNode, RelNode, RelNode, JoinRelType, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputJoinRel
- copy(RelTraitSet, RexNode, RelNode, RelNode, JoinRelType, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputLookupJoinRel
- copy(RelNode, RelDataType, RexNode, boolean, boolean, Map<String, RexNode>, Map<String, RexNode>, RexNode, Map<String, ? extends SortedSet<String>>, boolean, ImmutableBitSet, RelCollation, RexNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
- copy(DataInputView, DataOutputView) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- copy(DataInputView, DataOutputView) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- copy(DataInputView, DataOutputView) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- copy(T) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- copy(T, T) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- copyFrom(byte[]) - Static method in class org.apache.beam.sdk.io.range.ByteKey
-
Creates a new ByteKey backed by a copy of the specified byte[].
- copyFrom(ByteBuffer) - Static method in class org.apache.beam.sdk.io.range.ByteKey
-
Creates a new ByteKey backed by a copy of the data remaining in the specified ByteBuffer.
- copyFrom(FieldSpecifierNotationParser.DotExpressionComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
- copyFrom(FieldSpecifierNotationParser.QualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifierListContext
- copyResourcesFromJar(JarFile) - Method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarCreator
-
Copy resources from inputJar to PortablePipelineJarCreator.outputStream.
- copyToList(ArrayData, DataType) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
- coreName() - Method in class org.apache.beam.sdk.io.solr.SolrIO.ReplicaInfo
- coreUrl() - Method in class org.apache.beam.sdk.io.solr.SolrIO.ReplicaInfo
- CorrelationKey() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey
- cosh(Double) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinTrigonometricFunctions
-
COSH(X)
- CosmosClientBuilderFactory() - Constructor for class org.apache.beam.sdk.io.azure.cosmos.CosmosOptions.CosmosClientBuilderFactory
- CosmosIO - Class in org.apache.beam.sdk.io.azure.cosmos
- CosmosIO.BoundedCosmosBDSource<T> - Class in org.apache.beam.sdk.io.azure.cosmos
-
A BoundedSource reading from Cosmos.
- CosmosIO.Read<T> - Class in org.apache.beam.sdk.io.azure.cosmos
- CosmosOptions - Interface in org.apache.beam.sdk.io.azure.cosmos
- CosmosOptions.CosmosClientBuilderFactory - Class in org.apache.beam.sdk.io.azure.cosmos
-
Creates a Cosmos client from the pipeline options.
- COST_OPTIMIZED - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.FlexResourceSchedulingGoal
-
Optimize for lower cost.
- Count - Class in org.apache.beam.sdk.transforms
-
PTransforms to count the elements in a PCollection.
- COUNT - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- countAsserts(Pipeline) - Static method in class org.apache.beam.sdk.testing.PAssert
- counter(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can be incremented and decremented, and is aggregated by taking the sum.
- counter(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can be incremented and decremented, and is aggregated by taking the sum.
- Counter - Interface in org.apache.beam.sdk.metrics
-
A metric that reports a single long value and can be incremented or decremented.
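The Metrics.counter factory methods above are typically called once per DoFn instance and incremented per element. A minimal sketch (assuming the Beam SDK is on the classpath; the class name is illustrative):

```java
import org.apache.beam.sdk.metrics.Counter;
import org.apache.beam.sdk.metrics.Metrics;
import org.apache.beam.sdk.transforms.DoFn;

// Counts empty lines seen by this DoFn; values are summed across workers
// and reported under the CountEmptyLinesFn namespace.
public class CountEmptyLinesFn extends DoFn<String, String> {
  private final Counter emptyLines = Metrics.counter(CountEmptyLinesFn.class, "emptyLines");

  @ProcessElement
  public void processElement(@Element String line, OutputReceiver<String> out) {
    if (line.isEmpty()) {
      emptyLines.inc();
    }
    out.output(line);
  }
}
```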
- COUNTER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTable
- CounterImpl - Class in org.apache.beam.runners.jet.metrics
-
Implementation of Counter.
- CounterMark(long, Instant) - Constructor for class org.apache.beam.sdk.io.CountingSource.CounterMark
-
Creates a checkpoint mark reflecting the last emitted value.
- CounterMarkCoder() - Constructor for class org.apache.beam.sdk.io.CountingSource.CounterMarkCoder
- CountErrors(Counter) - Constructor for class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler.WriteErrorMetrics.CountErrors
- CountIf - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
-
Returns the count of TRUE values for expression.
- CountIf.CountIfFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
- CountIfFn() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- CountingPipelineVisitor - Class in org.apache.beam.runners.flink.translation.utils
-
Pipeline visitor that fills a lookup table of PValue to number of consumers.
- CountingPipelineVisitor() - Constructor for class org.apache.beam.runners.flink.translation.utils.CountingPipelineVisitor
- CountingReadableByteChannel - Class in org.apache.beam.sdk.extensions.gcp.util.channels
- CountingReadableByteChannel(ReadableByteChannel, Consumer<Integer>) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.channels.CountingReadableByteChannel
- CountingSeekableByteChannel - Class in org.apache.beam.sdk.extensions.gcp.util.channels
- CountingSeekableByteChannel(SeekableByteChannel, Consumer<Integer>, Consumer<Integer>) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- CountingSource - Class in org.apache.beam.sdk.io
-
Most users should use GenerateSequence instead.
- CountingSource.CounterMark - Class in org.apache.beam.sdk.io
-
The checkpoint for an unbounded CountingSource is simply the last value produced.
- CountingSource.CounterMarkCoder - Class in org.apache.beam.sdk.io
-
A custom coder for CounterMark.
- CountingWritableByteChannel - Class in org.apache.beam.sdk.extensions.gcp.util.channels
- CountingWritableByteChannel(WritableByteChannel, Consumer<Integer>) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.channels.CountingWritableByteChannel
- countPartitionsCreatedAfter(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Counts all partitions with a PartitionMetadataAdminDao.COLUMN_CREATED_AT less than the given timestamp.
- CountWords() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.CountWords
- CovarianceFn<T> - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
-
Combine.CombineFn for Covariance on Number types.
- coverSameKeySpace(List<Range.ByteStringRange>, Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Returns true if parentPartitions form a proper superset of childPartition.
- CrashingRunner - Class in org.apache.beam.sdk.testing
-
A PipelineRunner that applies no overrides and throws an exception on calls to Pipeline.run().
- CrashingRunner() - Constructor for class org.apache.beam.sdk.testing.CrashingRunner
- create() - Static method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Creates a ConnectorConfiguration.
- create() - Static method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
-
Returns a DataflowGroupByKey<K, V> PTransform.
- create() - Static method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.PayloadTranslator
- create() - Static method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
-
Creates a MapControlClientPool.
- create() - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessManager
- create() - Static method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
-
Create a new GrpcStateService.
- create() - Method in interface org.apache.beam.runners.jobsubmission.JobServerDriver.JobInvokerFactory
- create() - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Creates and returns a new SparkRunner with default options.
- create() - Static method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
-
Creates and returns a new SparkStructuredStreamingRunner with default options.
- create() - Static method in class org.apache.beam.sdk.fn.server.InProcessServerFactory
- create() - Static method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
- create() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
- create() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.StreamingInsertsMetricsImpl
- create() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Creates an instance of this rule using options provided by TestPipeline.testingPipelineOptions(). - create() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
Creates an instance of this rule.
- create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- create() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcWriteResult
- create() - Static method in class org.apache.beam.sdk.io.jms.RetryConfiguration
- create() - Static method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
- create() - Static method in class org.apache.beam.sdk.io.mongodb.AggregationQuery
- create() - Static method in class org.apache.beam.sdk.io.mongodb.FindQuery
- create() - Static method in class org.apache.beam.sdk.io.mongodb.UpdateConfiguration
- create() - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- create() - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
- create() - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- create() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
- create() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory
- create() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
- create() - Method in interface org.apache.beam.sdk.io.solace.broker.SempClientFactory
-
This method is the core of the factory interface.
- create() - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
-
This is the core method that subclasses must implement.
- create() - Static method in class org.apache.beam.sdk.io.solace.RetryCallableManager
-
Creates a new RetryCallableManager with default retry settings. - create() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
-
Creates a SplunkEvent object. - create() - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError.Builder
-
Builds a SplunkWriteError object. - create() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
-
Creates and returns an object that implements PipelineOptions using the values configured on this builder during construction. - create() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Creates and returns an object that implements PipelineOptions. - create() - Static method in class org.apache.beam.sdk.Pipeline
-
Constructs a pipeline from default PipelineOptions. - create() - Static method in class org.apache.beam.sdk.PipelineRunner
-
Creates a runner from the default app PipelineOptions. - create() - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return an empty FieldAccessDescriptor. - create() - Static method in class org.apache.beam.sdk.schemas.transforms.AddFields
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.Filter
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Returns a transform that does a global combine using an aggregation built up by calls to aggregateField and aggregateFields.
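A minimal sketch of building up such a global aggregation with chained aggregateField calls; the input PCollection<Row> and its "price" field are hypothetical, and the Beam schema transforms dependency is assumed on the classpath.

```java
import org.apache.beam.sdk.schemas.transforms.Group;
import org.apache.beam.sdk.transforms.Sum;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

class GlobalTotals {
  // Each aggregateField call adds one field to the single output Row
  // produced by the global combine.
  static PCollection<Row> totalPrice(PCollection<Row> purchases) {
    return purchases.apply(
        Group.<Row>globally()
            .aggregateField("price", Sum.ofDoubles(), "totalPrice"));
  }
}
```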
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.RenameFields
-
Create an instance of this transform.
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.Select
- create() - Static method in class org.apache.beam.sdk.testing.TestPipeline
-
Creates and returns a new test pipeline.
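As a sketch of typical JUnit 4 usage (the transform chain and expected values here are illustrative, assuming beam-sdks-java-core and JUnit on the classpath):

```java
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptors;
import org.junit.Rule;
import org.junit.Test;

public class WordLengthTest {
  // TestPipeline is declared as a rule; it enforces that run() is called.
  @Rule public final transient TestPipeline p = TestPipeline.create();

  @Test
  public void testLengths() {
    PCollection<Integer> lengths =
        p.apply(Create.of("a", "bb"))
            .apply(MapElements.into(TypeDescriptors.integers())
                .via((String s) -> s.length()));
    PAssert.that(lengths).containsInAnyOrder(1, 2);
    p.run();
  }
}
```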
- create() - Static method in class org.apache.beam.sdk.testing.TestPipelineExtension
-
Creates a new TestPipelineExtension with default options.
- create() - Static method in class org.apache.beam.sdk.transforms.Distinct
-
Returns a Distinct<T> PTransform. - create() - Static method in class org.apache.beam.sdk.transforms.DoFnSchemaInformation
-
Create an instance.
- create() - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns a GroupByKey<K, V> PTransform. - create() - Static method in class org.apache.beam.sdk.transforms.Impulse
-
Create a new Impulse PTransform. - create() - Static method in class org.apache.beam.sdk.transforms.join.CoGroupByKey
-
Returns a CoGroupByKey<K> PTransform. - create() - Static method in class org.apache.beam.sdk.transforms.Keys
-
Returns a Keys<K> PTransform. - create() - Static method in class org.apache.beam.sdk.transforms.KvSwap
-
Returns a KvSwap<K, V> PTransform. - create() - Static method in class org.apache.beam.sdk.transforms.PeriodicImpulse
- create() - Static method in class org.apache.beam.sdk.transforms.PeriodicSequence
- create() - Static method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
-
Creates a ResourceHints instance with no hints. - create() - Static method in class org.apache.beam.sdk.transforms.Values
-
Returns a Values<V> PTransform. - create(boolean) - Static method in class org.apache.beam.runners.flink.TestFlinkRunner
- create(boolean, Map<TupleTag<?>, KV<WindowingStrategy<?, ?>, SideInputBroadcast<?>>>) - Static method in class org.apache.beam.runners.spark.util.SideInputReaderFactory
-
Creates and returns a SideInputReader based on the configuration. - create(byte[], SparkPCollectionView.Type, Coder<Iterable<WindowedValue<?>>>) - Static method in class org.apache.beam.runners.spark.translation.SideInputMetadata
-
Creates a new instance of SideInputMetadata.
- create(byte[], SparkPCollectionView.Type, Coder<T>) - Static method in class org.apache.beam.runners.spark.util.SideInputBroadcast
- create(double) - Static method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
Returns a TDigestQuantiles.TDigestQuantilesFn combiner with the given compression factor. - create(double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
-
It creates an instance with rate=0 and window=rowCount for bounded sources.
- create(double, double, double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
- create(int) - Static method in class org.apache.beam.sdk.io.jms.RetryConfiguration
- create(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Like ApproximateQuantiles.ApproximateQuantilesCombineFn.create(int, Comparator), but sorts values using their natural ordering. - create(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Returns an approximate quantiles combiner with the given compareFn and desired number of quantiles. - create(int, ComparatorT, long, double) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Creates an approximate quantiles combiner with the given compareFn and desired number of quantiles. - create(int, Duration, Duration) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.RetryConfiguration
- create(int, Duration, Duration) - Static method in class org.apache.beam.sdk.io.jms.RetryConfiguration
- create(int, Duration) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.RetryConfiguration
-
Creates RetryConfiguration for ElasticsearchIO with the provided maxAttempts, maxDuration, and exponential-backoff-based retries. - create(int, Duration) - Static method in class org.apache.beam.sdk.io.solr.SolrIO.RetryConfiguration
- create(long) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
- create(long, long, long, long) - Static method in class org.apache.beam.sdk.metrics.DistributionResult
- create(long, long, SerializableFunction<InputT, Long>, Duration) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- create(long, Instant) - Static method in class org.apache.beam.sdk.metrics.GaugeResult
- create(BuilderT, ClientConfiguration, AwsOptions) - Method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
Configure a client builder ClientBuilderFactory using the provided ClientConfiguration and fall back to the global defaults in AwsOptions where necessary. - create(BuilderT, ClientConfiguration, AwsOptions) - Method in class org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory.DefaultClientBuilder
- create(BuilderT, AwsOptions) - Method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
Configure a client builder ClientBuilderFactory using the global defaults in AwsOptions. - create(StorageObject) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
- create(BatchTransactionId) - Static method in class org.apache.beam.sdk.io.gcp.spanner.Transaction
- create(Mutation, Mutation...) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
-
Creates a new group.
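A sketch using the Cloud Spanner client library's Mutation builder; the table and column names ("Albums", "Tracks", "AlbumId") are hypothetical. Mutations in a group are applied atomically, keyed by the primary mutation.

```java
import com.google.cloud.spanner.Mutation;
import org.apache.beam.sdk.io.gcp.spanner.MutationGroup;

class MutationGroupExample {
  static MutationGroup albumWithTrack() {
    // The first argument is the primary mutation; the rest ride along
    // in the same commit.
    Mutation primary =
        Mutation.newInsertBuilder("Albums").set("AlbumId").to(1L).build();
    Mutation detail =
        Mutation.newInsertBuilder("Tracks").set("AlbumId").to(1L).build();
    return MutationGroup.create(primary, detail);
  }
}
```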
- create(Mutation, Iterable<Mutation>) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- create(JCSMPProperties, Queue) - Static method in class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- create(EventT, Exception) - Static method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
-
Create new unprocessed event which failed due to an exception thrown.
- create(EventT, UnprocessedEvent.Reason) - Static method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
-
Create new unprocessed event.
- create(IOException) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
- create(String, long, String, byte[], Map<String, String>, byte[]) - Static method in class org.apache.beam.sdk.io.pulsar.PulsarMessage
- create(String, String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
- create(String, MetricName) - Static method in class org.apache.beam.sdk.metrics.MetricKey
- create(Class<?>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
- create(Iterable<MetricResult<Long>>, Iterable<MetricResult<DistributionResult>>, Iterable<MetricResult<GaugeResult>>, Iterable<MetricResult<StringSetResult>>, Iterable<MetricResult<BoundedTrieResult>>, Iterable<MetricResult<HistogramData>>) - Static method in class org.apache.beam.sdk.metrics.MetricQueryResults
- create(Long, long, Long, Long, long, long, long, boolean, ContiguousSequenceRange) - Static method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- create(Object...) - Method in interface org.apache.beam.sdk.schemas.SchemaUserTypeCreator
- create(Method) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
Creates Function from given method. - create(Method, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
Creates org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function from given method. - create(String) - Method in interface org.apache.beam.runners.fnexecution.control.OutputReceiverFactory
-
Get a new FnDataReceiver for an output PCollection. - create(String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Splits the input String by the "." separator and returns a new TableName. - create(String) - Static method in class org.apache.beam.sdk.fn.channel.AddHarnessIdInterceptor
- create(String) - Static method in interface org.apache.beam.sdk.io.aws2.auth.WebIdTokenProvider
-
Factory method for OIDC web identity token provider implementations.
- create(String) - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
- create(String) - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
- create(String) - Static method in class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
-
Creates a new Solr connection configuration.
- create(String) - Static method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.ParsedMetricName
- create(String[]) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
Creates a new Elasticsearch connection configuration with no default index or type.
- create(String...) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
Create an enumeration type from a fixed set of String values; integer values will be automatically chosen.
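For example, a sketch of creating an enumeration logical type from string names (the color names are illustrative):

```java
import org.apache.beam.sdk.schemas.logicaltypes.EnumerationType;

class ColorEnum {
  // Integer values are chosen automatically for the declared names.
  static final EnumerationType COLOR =
      EnumerationType.create("RED", "GREEN", "BLUE");

  static int codeOf(String name) {
    // valueOf(String) resolves a name to its EnumerationType.Value.
    return COLOR.valueOf(name).getValue();
  }
}
```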
- create(String[], String) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
Creates a new Elasticsearch connection configuration with no default type.
- create(String[], String, String) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
Creates a new Elasticsearch connection configuration.
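A sketch of the three-argument form; the host address, index name, and mapping type below are placeholders:

```java
import org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO;

class EsConfig {
  static ElasticsearchIO.ConnectionConfiguration connection() {
    // addresses, index, and type are all supplied explicitly here.
    return ElasticsearchIO.ConnectionConfiguration.create(
        new String[] {"http://localhost:9200"}, "my-index", "_doc");
  }
}
```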
- create(String, int) - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
- create(String, String) - Method in class org.apache.beam.runners.fnexecution.control.TimerReceiverFactory
- create(String, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
-
Create a PTransform instance.
- create(String, String) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
- create(String, String) - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
-
Describe a connection configuration to the MQTT broker.
- create(String, String, String) - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- create(String, String, String, long, long) - Static method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
- create(String, String, String, Boolean) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- create(String, String, String, Struct) - Static method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
- create(String, String, Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.sdk.fn.data.TimerEndpoint
- create(String, Map<String, String>) - Static method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.ParsedMetricName
- create(String, Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.sdk.fn.data.DataEndpoint
- create(String, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations
- create(String, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
- create(String, ByteString, OutputStream) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
- create(WritableByteChannel) - Method in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- create(WritableByteChannel) - Method in interface org.apache.beam.sdk.io.FileBasedSink.WritableByteChannelFactory
- create(List<? extends FnService>, Endpoints.ApiServiceDescriptor, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Create GrpcFnServers for the provided FnServices running on a specified port. - create(List<String>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Full table name with path.
- create(List<String>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
Create an enumeration type from a fixed set of String values; integer values will be automatically chosen.
- create(List<String>, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Table name plus the path up to, but not including, the table name.
- create(List<String>, String) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast.CompatibilityError
- create(List<String>, Map<String, List<Dependency>>) - Static method in class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
- create(List<String>, Optional<Schema.TypeName>) - Static method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
- create(List<Schema.Field>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create an OneOfType logical type. - create(List<Schema.Field>, Map<String, Integer>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create an OneOfType logical type. - create(List<BindableService>, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.server.InProcessServerFactory
- create(List<BindableService>, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Creates an instance of this server at the address specified by the given service descriptor and bound to multiple services.
- create(List<BindableService>, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.server.ServerFactory.InetSocketAddressServerFactory
- create(Map<String, Integer>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
Create an enumeration type over a set of String->Integer values.
- create(Map<String, String>) - Static method in class org.apache.beam.sdk.io.hcatalog.HCatalogBeamSchema
-
Create the schema adapter.
- create(Map<String, Broadcast<SideInputValues<?>>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
-
Creates a SideInputReader for Spark from a map of PCollectionView tag ids and the corresponding broadcasted SideInputValues. - create(Set<String>) - Static method in class org.apache.beam.sdk.metrics.StringSetResult
-
Creates a StringSetResult from the given Set by making an immutable copy. - create(Set<List<String>>) - Static method in class org.apache.beam.sdk.metrics.BoundedTrieResult
-
Creates a BoundedTrieResult from the given Set by making an immutable copy. - create(DataSource) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
- create(DataSource) - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Creates SnowflakeIO.DataSourceConfiguration from an existing instance of DataSource. - create(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.TypeWithNullability
- create(ProvisionApi.ProvisionInfo, HeaderAccessor) - Static method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
- create(Endpoints.ApiServiceDescriptor, HeaderAccessor) - Static method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
-
Create new instance of BeamWorkerStatusGrpcService. - create(DoFnRunner<InputT, OutputT>, String, Coder, Coder, OperatorStateBackend, KeyedStateBackend<Object>, int, SerializablePipelineOptions) - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
- create(DoFnRunner<InputT, OutputT>, String, Coder, Coder, OperatorStateBackend, KeyedStateBackend<Object>, int, SerializablePipelineOptions, Supplier<Locker>, Function<InputT, Object>, Runnable) - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
- create(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowClient
- create(FlinkJobServerDriver.FlinkServerConfiguration) - Static method in class org.apache.beam.runners.flink.FlinkJobInvoker
- create(ReferenceCountingExecutableStageContextFactory.Creator, SerializableFunction<Object, Boolean>) - Static method in class org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory
- create(EnvironmentFactory, GrpcFnServer<GrpcDataService>, GrpcFnServer<GrpcStateService>, IdGenerator) - Static method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
-
Deprecated.
- create(ProcessManager, RunnerApi.Environment, String, InstructionRequestHandler) - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
- create(ProcessManager, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool.Source, IdGenerator, PipelineOptions) - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory
- create(JobInfo) - Static method in class org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext
- create(JobInfo) - Static method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
- create(JobInfo, Map<String, EnvironmentFactory.Provider>) - Static method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
- create(SparkJobServerDriver.SparkServerConfiguration) - Static method in class org.apache.beam.runners.spark.SparkJobInvoker
- create(SparkPipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Creates and returns a new SparkRunner with specified options.
- create(SparkStructuredStreamingPipelineOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
-
Creates and returns a new SparkStructuredStreamingRunner with specified options.
- create(SparkCombineFn<InputT, ValueT, AccumT, ?>, Function<InputT, ValueT>, WindowingStrategy<?, ?>, Comparator<BoundedWindow>) - Static method in interface org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator
-
Create concrete accumulator for given type.
- create(Coder<InputT>) - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
Returns an ApproximateDistinct.ApproximateDistinctFn combiner with the given input coder. - create(Coder<InputT>) - Static method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
Returns a SketchFrequencies.CountMinSketchFn combiner with the given input coder. - create(Coder<T>) - Static method in class org.apache.beam.sdk.testing.TestStream
-
Create a new TestStream.Builder with no elements and watermark equal to BoundedWindow.TIMESTAMP_MIN_VALUE. - create(Coder<T>, Coder<MetaT>) - Static method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
- create(Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.sdk.fn.data.DecodingFnDataReceiver
- create(ExpansionService, String, int) - Static method in class org.apache.beam.sdk.expansion.service.ExpansionServer
-
Create an ExpansionServer for the provided ExpansionService running on an arbitrary port. - create(GcsPath, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Deprecated. Use GcsUtil.create(GcsPath, CreateOptions) instead. - create(GcsPath, String, Integer) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Deprecated. Use GcsUtil.create(GcsPath, CreateOptions) instead. - create(GcsPath, GcsUtil.CreateOptions) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Creates an object in GCS and prepares for uploading its contents.
- create(OrderedProcessingHandler<EventTypeT, EventKeyTypeT, StateTypeT, ResultTypeT>) - Static method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessor
-
Create the transform.
- create(BufferedExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
- create(BufferedExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.SortValues
-
Returns a SortValues<PrimaryKeyT, SecondaryKeyT, ValueT> PTransform. - create(ExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter
-
Returns a Sorter configured with the given ExternalSorter.Options. - create(DataCatalogPipelineOptions) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- create(GrpcFnServer<ArtifactStagingService>, Function<String, String>, ThrowingConsumer<Exception, String>, JobInvoker) - Static method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
Creates an InMemoryJobService.
- create(GrpcFnServer<ArtifactStagingService>, Function<String, String>, ThrowingConsumer<Exception, String>, JobInvoker, int) - Static method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
Creates an InMemoryJobService.
- create(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool.Source, IdGenerator) - Static method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory
- create(ClassLoaderFileSystem.ClassLoaderResourceId, CreateOptions) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
- create(EmptyMatchTreatment) - Static method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
Creates a FileIO.MatchConfiguration with the given EmptyMatchTreatment. - create(MatchResult.Status, IOException) - Static method in class org.apache.beam.sdk.io.fs.MatchResult
- create(MatchResult.Status, List<MatchResult.Metadata>) - Static method in class org.apache.beam.sdk.io.fs.MatchResult
- create(ResourceId, String) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Returns a write channel for the given ResourceId. - create(ResourceId, CreateOptions) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Returns a write channel for the given ResourceId with CreateOptions. - create(SubscriptionPartition) - Method in interface org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactory
- create(SubscriptionPartition) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactoryImpl
- create(SpannerConfig, String, String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.MetadataSpannerConfigFactory
-
Generates a SpannerConfig that can be used to access the change stream metadata database by copying only the necessary fields from the given primary database SpannerConfig and setting the instance ID and database ID to the supplied metadata values.
- create(MetricKey, T, T) - Static method in class org.apache.beam.sdk.metrics.MetricResult
- create(MetricKey, Boolean, T) - Static method in class org.apache.beam.sdk.metrics.MetricResult
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.StagerFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.UnboundedReaderMaxReadTimeFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.StagingLocationFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.EnableWindmillServiceDirectPathFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.GlobalConfigRefreshPeriodFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.HarnessUpdateReportingPeriodFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.LocalWindmillHostportFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.MaxStackTraceDepthToReportFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.PeriodicStatusPageDirectoryFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.WindmillServiceStreamingRpcBatchLimitFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DefaultGcpRegionFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.flink.FlinkPipelineOptions.MaxBundleSizeFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.flink.FlinkPipelineOptions.MaxBundleTimeFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.portability.testing.TestPortablePipelineOptions.DefaultJobServerConfigFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkCommonPipelineOptions.StorageLevelFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkCommonPipelineOptions.TmpCheckpointDirFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkContextOptions.EmptyListenersList
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceOptions.ExpansionServiceConfigFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceOptions.JavaClassLookupAllowListFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.DefaultProjectFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.EnableStreamingEngineFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpOAuthScopesFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpTempLocationFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpUserCredentialsFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.ExecutorServiceFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.PathValidatorFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsReadOptionsFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsUtilFactory
-
Returns an instance of GcsUtil based on the PipelineOptions. - create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions.MapFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsRegionFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsUserCredentialsFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.S3Options.S3UploadBufferSizeBytesFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.S3Options.SSECustomerKeyFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosOptions.CosmosClientBuilderFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.options.AzureOptions.AzureUserCredentialsFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsOptions.GoogleAdsCredentialsFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions.ConfigurationLocator
- create(PipelineOptions) - Method in class org.apache.beam.sdk.metrics.MetricsOptions.NoOpMetricsSink
- create(PipelineOptions) - Method in interface org.apache.beam.sdk.options.DefaultValueFactory
-
Creates a default value for a getter marked with Default.InstanceFactory.
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.ExecutorOptions.ScheduledExecutorServiceFactory
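The DefaultValueFactory entries above all follow one pattern: a default is computed lazily, on first access, and may be derived from other options on the same PipelineOptions. A minimal stand-alone sketch of that pattern — the `Options`, `registerDefault`, and `DefaultFactoryDemo` names below are hypothetical illustrations, not Beam's API:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical stand-in for PipelineOptions: a bag of values plus
// per-key default factories consulted only when a key is unset.
class Options {
    private final Map<String, Object> values = new HashMap<>();
    private final Map<String, Function<Options, Object>> defaults = new HashMap<>();

    void set(String key, Object value) { values.put(key, value); }

    void registerDefault(String key, Function<Options, Object> factory) {
        defaults.put(key, factory);
    }

    // Like Default.InstanceFactory: the factory runs lazily and may
    // derive its result from other options already present.
    Object get(String key) {
        return values.computeIfAbsent(key, k -> defaults.get(k).apply(this));
    }
}

public class DefaultFactoryDemo {
    public static void main(String[] args) {
        Options opts = new Options();
        opts.set("appName", "wordcount");
        // Default job name derived from the app name, the way a
        // DefaultValueFactory would derive it from PipelineOptions.
        opts.registerDefault("jobName", o -> o.get("appName") + "-job");
        System.out.println(opts.get("jobName")); // wordcount-job
    }
}
```

The factory receives the whole options object, so one default can depend on another option's value without eagerly computing anything.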
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.AtomicLongFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.DirectRunner
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.JobNameFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.UserAgentFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.BundleProcessorCacheTimeoutFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMbFactory
- create(PipelineOptions) - Static method in class org.apache.beam.sdk.Pipeline
-
Constructs a pipeline from the provided PipelineOptions.
- create(PipelineOptions) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcherFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions.EmptyListDefault
- create(PipelineOptions, Storage, HttpRequestInitializer, ExecutorService, Credentials, Integer, GcsUtil.GcsCountersOptions, GoogleCloudStorageReadOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsUtilFactory
-
Returns an instance of GcsUtil based on the given parameters.
- create(PipelineOptions, ExecutorService, OutboundObserverFactory) - Static method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
- create(PipelineOptions, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<FnApiControlClientPoolService>, ControlClientPool.Source) - Static method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory
- create(ValueProvider<TableReference>, DataFormat, ValueProvider<List<String>>, ValueProvider<String>, SerializableFunction<SchemaAndRecord, T>, Coder<T>, BigQueryServices, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- create(ValueProvider<TableReference>, ValueProvider<List<String>>, ValueProvider<String>, SerializableFunction<SchemaAndRecord, T>, Coder<T>, BigQueryServices) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- create(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
- create(ValueProvider<String>, ValueProvider<Integer>) - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
- create(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.DataSourceConfiguration
- create(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
-
Creates an instance of this rule.
- create(Schema) - Static method in class org.apache.beam.sdk.testing.TestStream
- create(Schema.Field...) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create a OneOfType logical type.
- create(Schema, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.EntityToRow
-
Create a PTransform instance.
- create(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Static method in class org.apache.beam.sdk.testing.TestStream
- create(JsonToRow.JsonToRowWithErrFn) - Static method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError
- create(Secret) - Static method in class org.apache.beam.sdk.transforms.GroupByEncryptedKey
-
Creates a GroupByEncryptedKey transform.
- create(PCollectionView<?>, Coder<T>) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
- create(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.CachingFactory
- create(TypeDescriptor<?>, Schema) - Method in interface org.apache.beam.sdk.schemas.Factory
- create(TypeDescriptor<T>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.TypeDescriptorWithSchema
- create(StreamObserver<ReqT>, Runnable) - Static method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
- create(StreamObserver<ReqT>, Runnable, Runnable) - Static method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
- create(Output<StreamRecord<WindowedValue<OutputT>>>, Lock, OperatorStateBackend) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.MultiOutputOutputManagerFactory
- create(Message<byte[]>) - Static method in class org.apache.beam.sdk.io.pulsar.PulsarMessage
- create(Function<InputT, ValueT>, SparkCombineFn.WindowedAccumulator.Type, Iterable<WindowedValue<AccumT>>, Comparator<BoundedWindow>) - Static method in interface org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator
-
Creates a concrete accumulator for the given type.
- create(Function<InputT, ValueT>, SparkCombineFn.WindowedAccumulator.Type, Comparator<BoundedWindow>) - Static method in interface org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator
- create(ResourceIdT, CreateOptions) - Method in class org.apache.beam.sdk.io.FileSystem
-
Returns a write channel for the given FileSystem.
- create(ServiceT, Endpoints.ApiServiceDescriptor) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Deprecated. This create function is used for Dataflow migration purposes only.
- create(ServiceT, Endpoints.ApiServiceDescriptor, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Create a GrpcFnServer for the provided FnService which will run at the endpoint specified in the Endpoints.ApiServiceDescriptor.
- create(AwsCredentialsProvider, Region, URI) - Static method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
- create(T) - Static method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
- Create<T> - Class in org.apache.beam.sdk.transforms
-
Create<T> takes a collection of elements of type T known when the pipeline is constructed and returns a PCollection<T> containing the elements.
- Create() - Constructor for class org.apache.beam.sdk.transforms.Create
- CREATE_IF_NEEDED - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
-
Specifies that tables should be created if needed.
- CREATE_IF_NEEDED - Enum constant in enum class org.apache.beam.sdk.io.snowflake.enums.CreateDisposition
- CREATE_NEVER - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
-
Specifies that tables should not be created.
- CREATE_NEVER - Enum constant in enum class org.apache.beam.sdk.io.snowflake.enums.CreateDisposition
- CREATE_STREAMING_SPARK_VIEW_URN - Static variable in class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView
- CREATE_TIME - Enum constant in enum class org.apache.beam.sdk.io.kafka.KafkaTimestampType
- Create.OfValueProvider<T> - Class in org.apache.beam.sdk.transforms
- Create.TimestampedValues<T> - Class in org.apache.beam.sdk.transforms
-
A PTransform that creates a PCollection whose elements have associated timestamps.
- Create.Values<T> - Class in org.apache.beam.sdk.transforms
-
A PTransform that creates a PCollection from a set of in-memory objects.
- Create.WindowedValues<T> - Class in org.apache.beam.sdk.transforms
-
A PTransform that creates a PCollection whose elements have associated windowing metadata.
- createAccumulator() - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- createAccumulator() - Method in interface org.apache.beam.sdk.extensions.sql.udf.AggregateFn
-
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
- createAccumulator() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
- createAccumulator(CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- createAccumulator(CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
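The createAccumulator() entries above each implement the first step of the combine lifecycle: create a zero-value accumulator, add inputs to it, merge accumulators produced by parallel bundles, then extract the final output. A self-contained sketch of that contract — the `CombineFn` interface and `MeanFn` class below are illustrative stand-ins, not Beam's Combine.CombineFn:

```java
import java.util.Arrays;
import java.util.List;

// Illustrative mirror of the combine contract: a runner may create many
// accumulators in parallel and merge them before extracting output.
interface CombineFn<InputT, AccumT, OutputT> {
    AccumT createAccumulator();                  // accumulation of zero inputs
    AccumT addInput(AccumT acc, InputT input);
    AccumT mergeAccumulators(List<AccumT> accs);
    OutputT extractOutput(AccumT acc);
}

public class CombineDemo {
    // Mean of doubles: the accumulator is a {sum, count} pair.
    static class MeanFn implements CombineFn<Double, double[], Double> {
        public double[] createAccumulator() { return new double[] {0, 0}; }
        public double[] addInput(double[] a, Double x) {
            a[0] += x; a[1] += 1; return a;
        }
        public double[] mergeAccumulators(List<double[]> accs) {
            double[] merged = createAccumulator();
            for (double[] a : accs) { merged[0] += a[0]; merged[1] += a[1]; }
            return merged;
        }
        public Double extractOutput(double[] a) { return a[0] / a[1]; }
    }

    public static void main(String[] args) {
        MeanFn fn = new MeanFn();
        // Simulate two parallel bundles, then merge.
        double[] a = fn.addInput(fn.addInput(fn.createAccumulator(), 1.0), 2.0);
        double[] b = fn.addInput(fn.createAccumulator(), 6.0);
        System.out.println(fn.extractOutput(fn.mergeAccumulators(Arrays.asList(a, b)))); // 3.0
    }
}
```

Because merging must be associative, the accumulator carries enough state ({sum, count} rather than a running mean) to combine partial results from any grouping of inputs.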
- createAll(Class<?>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
Creates Function for each method in a given class.
- createAndTrackNextReader() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- createArrayOf(String, Object[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createArtifactServerFactory(JobServerDriver.ServerConfiguration) - Static method in class org.apache.beam.runners.jobsubmission.JobServerDriver
- createBacklogGauge(MetricName) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
-
Creates a Gauge metric to record per-partition backlog with the given name.
- createBatch(Class<?>, Class<?>, Class<?>) - Static method in class org.apache.beam.sdk.io.cdap.Plugin
-
Creates a batch plugin instance.
- createBatchExecutionEnvironment(FlinkPipelineOptions) - Static method in class org.apache.beam.runners.flink.FlinkExecutionEnvironments
-
If the submitted job is a batch processing job, this method creates the adequate Flink ExecutionEnvironment depending on the user-specified options.
- createBigQueryClientCustomErrors() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
- createBitXOr(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations
- createBlob() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createBlockGenerator(BlockGeneratorListener) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
- createBounded() - Static method in interface org.apache.beam.runners.spark.translation.SparkInputDataProcessor
-
Creates a SparkInputDataProcessor that processes input elements in a separate thread and observes produced outputs via a bounded queue in another thread.
- createBoundedTableStatistics(Double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- createBucket(String, Bucket) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Creates a Bucket under the specified project in Cloud Storage or propagates an exception.
- createBuilder(S3Options) - Method in interface org.apache.beam.sdk.io.aws2.options.S3ClientBuilderFactory
- createBuilder(S3Options) - Method in class org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
- createBuilder(BlobstoreOptions) - Method in class org.apache.beam.sdk.io.azure.blobstore.DefaultBlobstoreClientBuilderFactory
- createBuilder(BlobstoreOptions) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreClientBuilderFactory
- createBytesXMLMessage(Solace.Record, boolean, DeliveryMode) - Static method in class org.apache.beam.sdk.io.solace.broker.MessageProducerUtils
-
Create a BytesXMLMessage to be published in Solace.
- createCatalog(String, String, Map<String, String>) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.CatalogManager
-
Creates and stores a catalog of a particular type.
- createCatalog(String, String, Map<String, String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.EmptyCatalogManager
- createCatalog(String, String, Map<String, String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogManager
- createCatalog(SqlIdentifier, String, Map<String, String>, boolean, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogManagerSchema
- createCatalogItems() - Static method in class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
- createClassLoader(List<String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.JavaUdfLoader
- createClob() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createCombineFn(AggregateCall, Schema.Field, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
-
Creates either a UDAF or a built-in Combine.CombineFn.
- createCombineFnAnalyticsFunctions(AggregateCall, Schema.Field, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
-
Creates either a UDAF or a built-in Combine.CombineFn for Analytic Functions.
- createComparator(boolean, ExecutionConfig) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- createComparator(boolean, ExecutionConfig) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- createConstantCombineFn() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
- createConstructorCreator(Class<? super T>, Constructor<T>, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- createConstructorCreator(Class<T>, Constructor<T>, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- createContextual(DeserializationContext, BeanProperty) - Method in class org.apache.beam.sdk.options.ValueProvider.Deserializer
- CREATED - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
- createDatabase(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
-
Creates a database with this name.
- createDatabase(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- createDatabase(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergCatalog
- createDatabase(SqlIdentifier, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogSchema
- createDataCatalogClient(DataCatalogPipelineOptions) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- CreateDataflowView<ElemT,
ViewT> - Class in org.apache.beam.runners.dataflow -
A DataflowRunner marker class for creating a PCollectionView.
- createDataset(String, String, DatasetProperties) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- createDataset(String, String, String, String, Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Create a Dataset with the given location, description and default expiration time for tables in the dataset (if null, tables don't expire).
- createDataset(String, String, String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
-
Create a Dataset with the given location, description and default expiration time for tables in the dataset (if null, tables don't expire).
- createDataset(String, String, String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- createDataset(List<WindowedValue<T>>, Encoder<WindowedValue<T>>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- createDatasetFromRDD(SparkSession, BoundedSource<T>, Supplier<PipelineOptions>, Encoder<WindowedValue<T>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.io.BoundedDatasetFactory
- createDatasetFromRows(SparkSession, BoundedSource<T>, Supplier<PipelineOptions>, Encoder<WindowedValue<T>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.io.BoundedDatasetFactory
- createDecompressingChannel(ReadableByteChannel) - Method in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- createDecompressingChannel(ReadableByteChannel) - Method in interface org.apache.beam.sdk.io.CompressedSource.DecompressingChannelFactory
-
Given a channel, create a channel that decompresses the content read from the channel.
- createDefault() - Static method in class org.apache.beam.sdk.coders.CoderRegistry
-
Creates a CoderRegistry containing registrations for all standard coders part of the core Java Apache Beam SDK and also any registrations provided by coder registrars.
- createDefault() - Static method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
-
Creates a ManagedChannel relying on the ManagedChannelBuilder to choose the channel type.
- createDefault() - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Create a default ServerFactory.InetSocketAddressServerFactory.
- createDefault() - Static method in interface org.apache.beam.sdk.fn.server.ServerFactory.UrlFactory
- createDefault() - Static method in class org.apache.beam.sdk.schemas.SchemaRegistry
- createDefault() - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- createDicomStore(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Create a DicomStore.
- createDicomStore(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createDicomStore(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createDicomStore(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Create a DicomStore with a PubSub listener.
- CreateDisposition - Enum Class in org.apache.beam.sdk.io.snowflake.enums
-
Enum containing all supported dispositions for a table.
- createEnumerator(SplitEnumeratorContext<FlinkSourceSplit<T>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- createEnumerator(SplitEnumeratorContext<FlinkSourceSplit<T>>, boolean) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory
-
Creates a new, active RemoteEnvironment backed by a local Docker container.
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory
- createEnvironment(RunnerApi.Environment, String) - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory
-
Creates an active RunnerApi.Environment and returns a handle to it.
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory
-
Creates a new, active RemoteEnvironment backed by an unmanaged worker.
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory
-
Creates a new, active RemoteEnvironment backed by a forked process.
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironmentFactory
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.Provider
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.Provider
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory.Provider
-
Creates EnvironmentFactory for the provided GrpcServices.
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.Provider
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory.Provider
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironmentFactory.Provider
- createEpoll() - Static method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
-
Creates a ManagedChannelFactory backed by an EpollDomainSocketChannel if the address is a DomainSocketAddress.
- createEpollDomainSocket() - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Create a ServerFactory.EpollDomainSocket.
- createEpollSocket() - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Create a ServerFactory.EpollSocket.
- createFactory() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
- createFactoryForCreateSubscription() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createFactoryForGetSchema(PubsubClient.TopicPath, PubsubClient.SchemaPath, Schema) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createFactoryForPublish(PubsubClient.TopicPath, Iterable<PubsubClient.OutgoingMessage>, Iterable<PubsubClient.OutgoingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
Return a factory for testing publishers.
- createFactoryForPull(Clock, PubsubClient.SubscriptionPath, int, Iterable<PubsubClient.IncomingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
Return a factory for testing subscribers.
- createFactoryForPullAndPublish(PubsubClient.SubscriptionPath, PubsubClient.TopicPath, Clock, int, Iterable<PubsubClient.IncomingMessage>, Iterable<PubsubClient.OutgoingMessage>, Iterable<PubsubClient.OutgoingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
Returns a factory for a test that is expected to both publish and pull messages over the course of the test.
- createFhirStore(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Create FHIR Store.
- createFhirStore(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createFhirStore(String, String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createFhirStore(String, String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Create FHIR Store with a PubSub topic listener.
- createFile() - Method in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
-
Generates a random file with NUM_LINES lines, each between 60 and 120 characters.
- createForSubrangeOfFile(String, long, long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
-
Deprecated. Used by the Dataflow worker.
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.BlockBasedSource
-
Creates a BlockBasedSource for the specified range in a single file.
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.CompressedSource
-
Creates a CompressedSource for a subrange of a file.
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.FileBasedSource
-
Creates and returns a new FileBasedSource of the same type as the current FileBasedSource backed by a given file and an offset range.
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.TextSource
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.xml.XmlSource
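The createForSubrangeOfFile variants above each build a source over a byte range [start, end) of one file, which a runner can split further for parallel reads. A hypothetical stdlib-only sketch of the range arithmetic involved — the `SubrangeDemo`, `Range`, and `split` names are illustrative, not Beam's actual FileBasedSource logic:

```java
import java.util.ArrayList;
import java.util.List;

public class SubrangeDemo {
    // A half-open byte range [start, end) of a single file, mirroring the
    // (file, startOffset, endOffset) shape of createForSubrangeOfFile.
    record Range(String file, long start, long end) {}

    // Split a range into at most n near-equal subranges. Readers later snap
    // to record boundaries, so split points need not align to records.
    static List<Range> split(Range r, int n) {
        List<Range> out = new ArrayList<>();
        long size = r.end() - r.start();
        long step = Math.max(1, (size + n - 1) / n); // ceil(size / n)
        for (long s = r.start(); s < r.end(); s += step) {
            out.add(new Range(r.file(), s, Math.min(s + step, r.end())));
        }
        return out;
    }

    public static void main(String[] args) {
        // A 100-byte file split three ways: [0,34), [34,68), [68,100).
        System.out.println(split(new Range("data.txt", 0, 100), 3));
    }
}
```

Each subrange can then back an independent reader, which is what lets a runner rebalance work across a large file.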
- createFrom(String) - Static method in class org.apache.beam.sdk.fn.channel.SocketAddressFactory
-
Parse a SocketAddress from the given string.
- createGetter(FieldValueTypeInformation, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- createGetterConversions(StackManipulation) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
- createGetterConversions(StackManipulation) - Method in interface org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversionsFactory
- createHaver(Class<ObjectT>, Method) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- createHL7v2Message(String, Message) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Creates an HL7v2 message.
- createHL7v2Message(String, Message) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createHL7v2Store(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Creates an HL7v2 store.
- createHL7v2Store(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createImplementor(Method) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
- createInProcess() - Static method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
-
Creates a ManagedChannel using an in-process channel.
- createInput(Pipeline, Map<String, PCollection<?>>) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceSchemaTransformProvider
- createInput(Pipeline, Map<String, PCollection<?>>) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
- createInputFormatInstance() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
-
Creates an instance of the InputFormat class.
- createInputSplits(int) - Method in class org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat
- createInputSplits(int) - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat
- createInstance() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- createInstance() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- createInstance() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- createInternal(WindowingStrategy) - Static method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
- createIterator() - Method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables.Default
- createJCSMPSendMultipleEntry(List<Solace.Record>, boolean, SerializableFunction<Solace.Record, Destination>, DeliveryMode) - Static method in class org.apache.beam.sdk.io.solace.broker.MessageProducerUtils
-
Create a JCSMPSendMultipleEntry array to be published in Solace.
- createJob(Job) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
Creates the Dataflow Job.
- createJobInvocation(String, String, ListeningExecutorService, RunnerApi.Pipeline, FlinkPipelineOptions, PortablePipelineRunner) - Method in class org.apache.beam.runners.flink.FlinkJobInvoker
- createJobServerFactory(JobServerDriver.ServerConfiguration) - Static method in class org.apache.beam.runners.jobsubmission.JobServerDriver
- createJobService() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
- createKafkaRead() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- createMessagesArray(Iterable<Solace.Record>, boolean) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- createMetadata(MetaT) - Static method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
- createMetadataTable() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
-
Create the metadata table if it does not exist yet.
- createNamespace(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- createNClob() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createNewDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Creates a new dataset.
- createNewDataset(String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Creates a new dataset with defaultTableExpirationMs.
- createNewDataset(String, String, Long, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Creates a new dataset with defaultTableExpirationMs and in a specified location (GCP region).
- createNewTable(String, String, Table) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- CreateOptions - Class in org.apache.beam.sdk.io.fs
-
An abstract class that contains common configuration options for creating resources.
- CreateOptions() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
- CreateOptions() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions
- CreateOptions.Builder<BuilderT> - Class in org.apache.beam.sdk.io.fs
-
An abstract builder for CreateOptions.
- CreateOptions.StandardCreateOptions - Class in org.apache.beam.sdk.io.fs
-
Standard configuration options with a builder.
- CreateOptions.StandardCreateOptions.Builder - Class in org.apache.beam.sdk.io.fs
-
Builder for CreateOptions.StandardCreateOptions.
- createOrUpdateReadChangeStreamMetadataTable(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
-
Utility method to create or update the Read Change Stream metadata table.
- createOutboundAggregator(Supplier<String>, boolean) - Method in interface org.apache.beam.runners.fnexecution.data.FnDataService
-
Creates a BeamFnDataOutboundAggregator for buffering and sending outbound data and timers over the data plane.
- createOutboundAggregator(Supplier<String>, boolean) - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
- createOutputIterator(Iterator<WindowedValue<FnInputT>>, SparkProcessContext<K, FnInputT, FnOutputT>) - Method in interface org.apache.beam.runners.spark.translation.SparkInputDataProcessor
-
Creates a transformation which processes input partition data and returns output results as an Iterator.
- createOutputMap(Iterable<String>) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
-
Creates a mapping from PCollection id to output tag integer.
- createPane(boolean, boolean, PaneInfo.Timing) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
- createPane(boolean, boolean, PaneInfo.Timing, long, long) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
Factory method to create a PaneInfo with the specified parameters.
- createPane(boolean, boolean, PaneInfo.Timing, long, long, boolean) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
Factory method to create a PaneInfo with the specified parameters.
- createPartitionMetadataTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Creates the metadata table in the given instance and database configuration, using the table name specified in the constructor.
- createPipeline(PipelineOptions) - Method in class org.apache.beam.sdk.expansion.service.ExpansionService
- createPipelineOptions(Map<String, String>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
- createPlanner(JdbcConnection, Collection<RuleSet>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.Factory
- createPrepareContext() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createPrimitiveOutputInternal(Pipeline, WindowingStrategy<?, ?>, PCollection.IsBounded, Coder<T>) - Static method in class org.apache.beam.sdk.values.PCollection
-
For internal use only; no backwards-compatibility guarantees.
- createPrimitiveOutputInternal(Pipeline, WindowingStrategy<?, ?>, PCollection.IsBounded, Coder<T>, TupleTag<?>) - Static method in class org.apache.beam.sdk.values.PCollection
-
For internal use only; no backwards-compatibility guarantees.
- createProcessContext(ValueInSingleWindow<InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- createProperties() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- createPushDownRel(RelDataType, List<String>, BeamSqlTableFilter) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- createQuery(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- createQuery(Expression, Class<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createQuery(Expression, Type) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createQueryUsingStandardSql(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- createQueueForTopic(String, String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClient
- createQueueForTopic(String, String) - Method in interface org.apache.beam.sdk.io.solace.broker.SempClient
-
This is only called when a user requests to read data from a topic.
- createRandomSubscription(PubsubClient.ProjectPath, PubsubClient.TopicPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create a random subscription for topic.
- createReader(FlinkSourceSplit<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSourceReader
- createReader(FlinkSourceSplit<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
-
Create a Source.Reader for the given FlinkSourceSplit.
- createReader(FlinkSourceSplit<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
- createReader(PipelineOptions) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.impulse.BeamImpulseSource
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.BoundedCosmosBDSource
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
-
Returns a new BoundedSource.BoundedReader that reads from this source.
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
- createReader(PipelineOptions, CheckpointMarkT) - Method in class org.apache.beam.sdk.io.UnboundedSource
-
Create a new UnboundedSource.UnboundedReader to read from this source, resuming from the given checkpoint if present.
- createReader(PipelineOptions, CheckpointMarkImpl) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
- createReader(PipelineOptions, SolaceCheckpointMark) - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- createReader(SourceReaderContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSource
- createReader(SourceReaderContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSource
- createReadSession(CreateReadSessionRequest) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
-
Create a new read session against an existing table.
- createRPCLatencyHistogram(KafkaSinkMetrics.RpcMethod, String) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
-
Creates a Histogram metric to record RPC latency with the name
- createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create Schema from Schema definition content.
- createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Create Schema from Schema definition content.
- createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
Create Schema from Schema definition content.
- createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createSerializer(ExecutionConfig) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- createSerializer(ExecutionConfig) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- createSessionToken(String) - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
- createSetter(FieldValueTypeInformation, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- createSetterConversions(StackManipulation) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
- createSetterConversions(StackManipulation) - Method in interface org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversionsFactory
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.BlockBasedSource
-
Creates a BlockBasedReader.
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.CompressedSource
-
Creates a FileBasedReader to read a single file.
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
-
Creates and returns an instance of a FileBasedReader implementation for the current source, assuming the source represents a single file.
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.TextSource
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.xml.XmlSource
- createSingleMessage(Solace.Record, boolean) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- createSource - Variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- createSourceForSubrange(long, long) - Method in class org.apache.beam.sdk.io.FileBasedSource
- createSourceForSubrange(long, long) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns an OffsetBasedSource for a subrange of the current source.
- createSQLXML() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createStateBackend(FlinkPipelineOptions) - Method in interface org.apache.beam.runners.flink.FlinkStateBackendFactory
- createStatement() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createStatement(int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createStatement(int, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createStateOnInitialEvent(EventT) - Method in interface org.apache.beam.sdk.extensions.ordered.EventExaminer
-
If the event was the first event for a given key, create the state that holds the data required for processing.
- createStaticCreator(Class<T>, Method, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- createStaticCreator(Class<T>, Method, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- CreateStream<T> - Class in org.apache.beam.runners.spark.io
-
Create an input stream from a Queue.
- createStreamExecutionEnvironment(FlinkPipelineOptions, List<String>, String) - Static method in class org.apache.beam.runners.flink.FlinkExecutionEnvironments
-
If the submitted job is a stream processing job, this method creates the appropriate Flink StreamExecutionEnvironment depending on the user-specified options.
- createStreaming(Class<?>, SerializableFunction<V, Long>, Class<? extends Receiver<V>>) - Static method in class org.apache.beam.sdk.io.cdap.Plugin
-
Creates a streaming plugin instance with a default function for getting args for the Receiver.
- createStreaming(Class<?>, SerializableFunction<V, Long>, Class<? extends Receiver<V>>, SerializableFunction<PluginConfig, Object[]>) - Static method in class org.apache.beam.sdk.io.cdap.Plugin
-
Creates a streaming plugin instance.
- CreateStreamingSparkView<ElemT, ViewT> - Class in org.apache.beam.runners.spark.translation.streaming
-
Spark streaming overrides for various view (side input) transforms.
- CreateStreamingSparkView(PCollectionView<ViewT>) - Constructor for class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView
- CreateStreamingSparkView.CreateSparkPCollectionView<ElemT, ViewT> - Class in org.apache.beam.runners.spark.translation.streaming
-
Creates a primitive PCollectionView.
- CreateStreamingSparkView.Factory<ElemT, ViewT> - Class in org.apache.beam.runners.spark.translation.streaming
- createStruct(String, Object[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createStructuralValues(Coder<T>, List<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Testing utilities below depend on standard assertions and matchers to compare elements read by sources.
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create subscription to topic.
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createTable(Table) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Creates the specified table if it does not exist.
- createTable(Table) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
-
Creates the specified table if it does not exist.
- createTable(Table) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- createTable(String, Schema, List<String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergMetastore
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
- createTable(Table) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Creates a table.
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
- CreateTableHelpers - Class in org.apache.beam.sdk.io.gcp.bigquery
- CreateTableHelpers() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers
- CreateTables<DestinationT, ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
Creates any tables needed before performing streaming writes to the tables.
- CreateTables(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
-
The list of tables created so far, so that we don't retry creation each time.
- createTest(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
- createTimestampPolicy(TopicPartition, Optional<Instant>) - Method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
-
Creates a TimestampPolicy for a partition.
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create topic.
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create TopicPath with PubsubClient.SchemaPath.
- createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createTransaction() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
-
Returns a transform that creates a batch transaction.
- CreateTransaction() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
- createTranslationContext(JobInfo, FlinkPipelineOptions, String, List<String>) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
-
Creates a batch translation context.
- createTranslationContext(JobInfo, FlinkPipelineOptions, String, List<String>) - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator
- createTranslationContext(JobInfo, FlinkPipelineOptions, String, List<String>) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
-
Creates a streaming translation context.
- createTranslationContext(JobInfo, FlinkPipelineOptions, ExecutionEnvironment) - Static method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
- createTranslationContext(JobInfo, FlinkPipelineOptions, StreamExecutionEnvironment) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
-
Creates a streaming translation context.
- createTranslationContext(JavaSparkContext, SparkPipelineOptions, JobInfo) - Method in class org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator
- createTranslationContext(JavaSparkContext, SparkPipelineOptions, JobInfo) - Method in interface org.apache.beam.runners.spark.translation.SparkPortablePipelineTranslator
- createTranslationContext(JavaSparkContext, SparkPipelineOptions, JobInfo) - Method in class org.apache.beam.runners.spark.translation.SparkStreamingPortablePipelineTranslator
- createTranslator() - Static method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
-
Creates a batch translator.
- createTranslator(Map<String, FlinkBatchPortablePipelineTranslator.PTransformTranslator>) - Static method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
-
Creates a batch translator.
- createTypeConversion(boolean) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
- createTypeConversion(boolean) - Method in interface org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversionsFactory
- createUnbounded() - Static method in interface org.apache.beam.runners.spark.translation.SparkInputDataProcessor
-
Creates a SparkInputDataProcessor which does processing in the calling thread.
- createUnboundedTableStatistics(Double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- createUrl(String, int) - Method in interface org.apache.beam.sdk.fn.server.ServerFactory.UrlFactory
- createValue(int, T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create a OneOfType.Value specifying which field to set and the value to set.
- createValue(String, T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create a OneOfType.Value specifying which field to set and the value to set.
- createValue(EnumerationType.Value, T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create a OneOfType.Value specifying which field to set and the value to set.
- createWatermarkPolicy() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory
- createWithAllowDuplicates() - Static method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
-
Returns a DataflowGroupByKey<K, V> PTransform whose output can contain duplicate elements.
- createWithBytesReadConsumer(SeekableByteChannel, Consumer<Integer>) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- createWithBytesWrittenConsumer(SeekableByteChannel, Consumer<Integer>) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- createWithCustomGbk(Secret, PTransform<PCollection<KV<byte[], KV<byte[], byte[]>>>, PCollection<KV<byte[], Iterable<KV<byte[], byte[]>>>>>) - Static method in class org.apache.beam.sdk.transforms.GroupByEncryptedKey
-
Creates a GroupByEncryptedKey transform with a custom GBK in the middle.
- createWithNoOpConsumer(ReadableByteChannel) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingReadableByteChannel
- createWithNoOpConsumer(SeekableByteChannel) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- createWithNoOpConsumer(WritableByteChannel) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingWritableByteChannel
- createWithPortSupplier(Supplier<Integer>) - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Create a ServerFactory.InetSocketAddressServerFactory that uses ports from a supplier.
- createWithUrlFactory(ServerFactory.UrlFactory) - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Create a ServerFactory.InetSocketAddressServerFactory that uses the given URL factory.
- createWithUrlFactoryAndPortSupplier(ServerFactory.UrlFactory, Supplier<Integer>) - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Create a ServerFactory.InetSocketAddressServerFactory that uses the given URL factory and ports from a supplier.
- createWrappingDoFnRunner(DoFnRunner<InputT, OutputT>, StepContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- createWrappingDoFnRunner(DoFnRunner<InputT, OutputT>, StepContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- createWrappingDoFnRunner(DoFnRunner<KeyedWorkItem<byte[], KV<InputT, RestrictionT>>, OutputT>, StepContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SplittableDoFnOperator
- createWrappingDoFnRunner(DoFnRunner<KeyedWorkItem<K, InputT>, KV<K, OutputT>>, StepContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.WindowDoFnOperator
- createWriteOperation() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSink
- createWriteOperation() - Method in class org.apache.beam.sdk.io.FileBasedSink
-
Return a subclass of FileBasedSink.WriteOperation that will manage the write to the sink.
- createWriter() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
Clients must implement to return a subclass of FileBasedSink.Writer.
- createWriteStream(String, WriteStream.Type) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
-
Create a Write Stream for use with the Storage Write API.
- createWriteStream(String, WriteStream.Type) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- createWriteStream(String, WriteStream.Type) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- Creating Tables - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- CredentialFactory - Interface in org.apache.beam.sdk.extensions.gcp.auth
-
Construct an oauth credential to be used by the SDK and the SDK workers.
- credentialsProvider() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
-
Optional AwsCredentialsProvider.
- credentialsProvider(AwsCredentialsProvider) - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
-
Optional AwsCredentialsProvider.
- CrossLanguageConfiguration - Class in org.apache.beam.sdk.io.snowflake.crosslanguage
-
Abstract class of parameters used to expose the transforms to an external SDK.
- CrossLanguageConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
- CrossLanguageConfiguration() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- crossProductJoin() - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.Impl
-
Expand the join into individual rows, similar to SQL joins.
- CsvConfiguration() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.CsvConfiguration
- csvConfigurationBuilder() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- CsvIO - Class in org.apache.beam.sdk.io.csv
-
PTransforms for reading and writing CSV files.
- CsvIO() - Constructor for class org.apache.beam.sdk.io.csv.CsvIO
- CsvIO.Write<T> - Class in org.apache.beam.sdk.io.csv
-
PTransform for writing CSV files.
- CsvIOParse<T> - Class in org.apache.beam.sdk.io.csv
-
PTransform for parsing CSV record strings into Schema-mapped target types.
- CsvIOParse() - Constructor for class org.apache.beam.sdk.io.csv.CsvIOParse
- CsvIOParseError - Class in org.apache.beam.sdk.io.csv
-
CsvIOParseError is a data class to store errors from CSV record processing.
- CsvIOParseError() - Constructor for class org.apache.beam.sdk.io.csv.CsvIOParseError
- CsvIOParseResult<T> - Class in org.apache.beam.sdk.io.csv
- csvLines2BeamRows(CSVFormat, String, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
- CsvSink - Class in org.apache.beam.runners.spark.metrics.sink
-
A Sink for Spark's metric system reporting metrics (including Beam step metrics) to a CSV file.
- CsvSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.metrics.sink.CsvSink
-
Constructor for Spark 3.2.x and later.
- CsvSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.metrics.sink.CsvSink
-
Constructor for Spark 3.1.x and earlier.
- CsvToRow(Schema, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.CsvToRow
- CsvWriteConfiguration() - Constructor for class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration
- CsvWriteSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
-
A FileWriteSchemaTransformFormatProvider for CSV format.
- CsvWriteSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.CsvWriteSchemaTransformFormatProvider
- CsvWriteTransformProvider - Class in org.apache.beam.sdk.io.csv.providers
-
An implementation of TypedSchemaTransformProvider for CsvIO.write(java.lang.String, org.apache.commons.csv.CSVFormat).
- CsvWriteTransformProvider() - Constructor for class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
- CsvWriteTransformProvider.CsvWriteConfiguration - Class in org.apache.beam.sdk.io.csv.providers
-
Configuration for writing CSV files.
- CsvWriteTransformProvider.CsvWriteConfiguration.Builder - Class in org.apache.beam.sdk.io.csv.providers
-
Builder for CsvWriteTransformProvider.CsvWriteConfiguration.
- CsvWriteTransformProvider.CsvWriteTransform - Class in org.apache.beam.sdk.io.csv.providers
- ctx - Variable in class org.apache.beam.runners.spark.translation.AbstractInOutIterator
- ctxt - Variable in class org.apache.beam.runners.spark.SparkRunner.Evaluator
- ctxt - Variable in class org.apache.beam.runners.spark.translation.GroupByKeyVisitor
- current() - Static method in class org.apache.beam.sdk.io.googleads.GoogleAdsIO
- CURRENT_METADATA_TABLE_VERSION - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- currentCatalog() - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.CatalogManager
-
Produces the currently active catalog.
- currentCatalog() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.EmptyCatalogManager
- currentCatalog() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogManager
- currentDatabase - Variable in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- currentDatabase() - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
-
Produces the currently active database.
- currentDatabase() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- currentEventTime() - Method in interface org.apache.beam.sdk.state.Timers
-
Returns the current event time.
- currentInputWatermarkTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- currentOutputWatermarkTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- currentProcessingTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- currentProcessingTime() - Method in interface org.apache.beam.sdk.state.Timers
-
Returns the current processing time.
- currentRecordId() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
- currentRecordOffset() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
- currentRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
-
Returns the streamProgress that was successfully claimed.
- currentRestriction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
-
Returns a restriction accurately describing the full range of work the current DoFn.ProcessElement call will do, including already completed work.
- currentSynchronizedProcessingTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- currentSynchronizedProcessingTime() - Method in interface org.apache.beam.sdk.state.Timers
-
Returns the current synchronized processing time, or null if unknown.
- currentWatermark - Variable in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
- currentWatermark() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimator
-
Returns the estimated output watermark.
- currentWatermark() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.Manual
- currentWatermark() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.MonotonicallyIncreasing
- currentWatermark() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.WallTime
- custom() - Static method in class org.apache.beam.sdk.io.thrift.ThriftSchema
-
Builds a schema provider that maps any thrift type to a Beam schema, allowing custom thrift typedef entries (which cannot be resolved using the available metadata) to be manually registered with their corresponding Beam types.
- CUSTOM - Enum constant in enum class org.apache.beam.sdk.io.solace.SolaceIO.SubmissionMode
- CUSTOM_AUDIT_JOB_ENTRY_KEY - Static variable in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.GcsCustomAuditEntries
- CUSTOM_SOURCE_FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- CustomCoder<T> - Class in org.apache.beam.sdk.coders
-
An abstract base class that implements all methods of Coder except Coder.encode(T, java.io.OutputStream) and Coder.decode(java.io.InputStream).
- CustomCoder() - Constructor for class org.apache.beam.sdk.coders.CustomCoder
- Customer - Class in org.apache.beam.sdk.extensions.sql.example.model
-
Describes a customer.
- Customer() - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Customer
- Customer(int, String, String) - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Customer
- CustomHttpErrors - Class in org.apache.beam.sdk.extensions.gcp.util
-
An optional component to use with the RetryHttpRequestInitializer in order to provide custom errors for failing HTTP calls.
- CustomHttpErrors.Builder - Class in org.apache.beam.sdk.extensions.gcp.util
-
A builder for creating immutable CustomHttpErrors objects.
- CustomHttpErrors.MatcherAndError - Class in org.apache.beam.sdk.extensions.gcp.util
-
A simple Tuple class for creating a list of HttpResponseMatcher and HttpResponseCustomError to print for the responses.
- CustomSources - Class in org.apache.beam.runners.dataflow.internal
-
A helper class for supporting sources defined as Source.
- CustomSources() - Constructor for class org.apache.beam.runners.dataflow.internal.CustomSources
- CustomTableResolver - Interface in org.apache.beam.sdk.extensions.sql.meta
-
Interface that table providers can implement if they require custom table name resolution.
- CustomTimestampPolicyWithLimitedDelay<K, V> - Class in org.apache.beam.sdk.io.kafka
-
A policy for custom record timestamps where timestamps within a partition are expected to be roughly monotonically increasing, with a cap on out-of-order event delays (say, 1 minute).
- CustomTimestampPolicyWithLimitedDelay(SerializableFunction<KafkaRecord<K, V>, Instant>, Duration, Optional<Instant>) - Constructor for class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
-
A policy for custom record timestamps where timestamps are expected to be roughly monotonically increasing, with out-of-order event delays less than maxDelay.
- Custom timestamps - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
- Section
- CustomX509TrustManager - Class in org.apache.beam.sdk.io.splunk
-
A custom X509TrustManager that trusts a user-provided CA in addition to the default CAs.
- CustomX509TrustManager(X509Certificate) - Constructor for class org.apache.beam.sdk.io.splunk.CustomX509TrustManager
D
- DAGBuilder - Class in org.apache.beam.runners.jet
-
Utility class for wiring up Jet DAGs based on Beam pipelines.
- DAGBuilder.WiringListener - Interface in org.apache.beam.runners.jet
-
Listener that can be registered with a DAGBuilder in order to be notified when edges are being registered.
- DaoFactory - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
- DaoFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Factory class to create data access objects to perform change stream queries and access the metadata tables.
- DaoFactory(BigtableConfig, BigtableConfig, String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- DaoFactory(SpannerConfig, String, SpannerConfig, PartitionMetadataTableNames, Options.RpcPriority, String, Dialect, Dialect) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Constructs a DaoFactory with the configuration to be used for the underlying instances.
- data() - Method in class org.apache.beam.sdk.io.solace.data.Semp.Queue
- data(String, String) - Static method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
- data(StreamObserver<BeamFnApi.Elements>) - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
- Data() - Constructor for class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
- DATA_BUFFER_SIZE_LIMIT - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
- DATA_BUFFER_TIME_LIMIT_MS - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
- DATA_RECORD_COMMITTED_TO_EMITTED_0MS_TO_1000MS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for record latencies [0, 1000) ms during the execution of the Connector.
- DATA_RECORD_COMMITTED_TO_EMITTED_1000MS_TO_3000MS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for record latencies [1000, 3000) ms during the execution of the Connector.
- DATA_RECORD_COMMITTED_TO_EMITTED_3000MS_TO_INF_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for record latencies equal to or above 3000 ms during the execution of the Connector.
- DATA_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of data records identified during the execution of the Connector.
- database() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Returns the database name in this table path.
- database() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
- databaseExists(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
-
Returns true if the database exists.
- databaseExists(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- databaseExists(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergCatalog
- databaseId - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
- databaseId - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
- databaseId - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
- Database Schema Preparation - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- DataCatalogPipelineOptions - Interface in org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
-
Pipeline options for Data Catalog table provider.
- DataCatalogPipelineOptionsRegistrar - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
- DataCatalogPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptionsRegistrar
- dataCatalogSegments() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
- dataCatalogSegments(TableReference, BigQueryOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
- DataCatalogTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
-
Uses DataCatalog to get the source type and schema for a table.
- DataChangeRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
A data change record encodes modifications to Cloud Spanner rows.
- DataChangeRecord(String, Timestamp, String, boolean, String, String, List<ColumnType>, List<Mod>, ModType, ValueCaptureType, long, long, String, boolean, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
Constructs a data change record for a given partition, at a given timestamp, for a given transaction.
- dataChangeRecordAction(ThroughputEstimator<DataChangeRecord>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class capable of processing DataChangeRecords.
- DataChangeRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is part of the process for the ReadChangeStreamPartitionDoFn SDF.
- DataChangeRecordAction(ThroughputEstimator<DataChangeRecord>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DataChangeRecordAction
- DataEndpoint<T> - Class in org.apache.beam.sdk.fn.data
- DataEndpoint() - Constructor for class org.apache.beam.sdk.fn.data.DataEndpoint
- DataflowClient - Class in org.apache.beam.runners.dataflow
-
Wrapper around the generated Dataflow client to provide common functionality.
- DataflowClientFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory
- DataflowGroupByKey<K, V> - Class in org.apache.beam.runners.dataflow.internal
-
Specialized implementation of GroupByKey for translating the Redistribute transform into Dataflow service protos.
- DataflowGroupByKey.Registrar - Class in org.apache.beam.runners.dataflow.internal
-
Registers DataflowGroupByKey.DataflowGroupByKeyTranslator.
- DataflowJobAlreadyExistsException - Exception Class in org.apache.beam.runners.dataflow
-
An exception that is thrown if the unique job name constraint of the Dataflow service is broken because an existing job with the same job name is currently active.
- DataflowJobAlreadyExistsException(DataflowPipelineJob, String) - Constructor for exception class org.apache.beam.runners.dataflow.DataflowJobAlreadyExistsException
-
Create a new DataflowJobAlreadyExistsException with the specified DataflowPipelineJob and message.
- DataflowJobAlreadyUpdatedException - Exception Class in org.apache.beam.runners.dataflow
-
An exception that is thrown if the existing job has already been updated within the Dataflow service and is no longer able to be updated.
- DataflowJobAlreadyUpdatedException(DataflowPipelineJob, String) - Constructor for exception class org.apache.beam.runners.dataflow.DataflowJobAlreadyUpdatedException
-
Create a new DataflowJobAlreadyUpdatedException with the specified DataflowPipelineJob and message.
- DataflowJobException - Exception Class in org.apache.beam.runners.dataflow
-
A RuntimeException that contains information about a DataflowPipelineJob.
- DataflowPipelineDebugOptions - Interface in org.apache.beam.runners.dataflow.options
-
Internal.
- DataflowPipelineDebugOptions.DataflowClientFactory - Class in org.apache.beam.runners.dataflow.options
-
Returns the default Dataflow client built from the passed in PipelineOptions.
- DataflowPipelineDebugOptions.StagerFactory - Class in org.apache.beam.runners.dataflow.options
-
Creates a Stager object using the class specified in DataflowPipelineDebugOptions.getStagerClass().
- DataflowPipelineDebugOptions.UnboundedReaderMaxReadTimeFactory - Class in org.apache.beam.runners.dataflow.options
-
Sets an Integer value based on the old, deprecated field (DataflowPipelineDebugOptions.getUnboundedReaderMaxReadTimeSec()).
- DataflowPipelineJob - Class in org.apache.beam.runners.dataflow
-
A DataflowPipelineJob represents a job submitted to Dataflow using DataflowRunner.
- DataflowPipelineJob(DataflowClient, String, DataflowPipelineOptions, Map<AppliedPTransform<?, ?, ?>, String>) - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Constructs the job.
- DataflowPipelineJob(DataflowClient, String, DataflowPipelineOptions, Map<AppliedPTransform<?, ?, ?>, String>, RunnerApi.Pipeline) - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Constructs the job.
- DataflowPipelineOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options that can be used to configure the DataflowRunner.
- DataflowPipelineOptions.FlexResourceSchedulingGoal - Enum Class in org.apache.beam.runners.dataflow.options
-
Set of available Flexible Resource Scheduling goals.
- DataflowPipelineOptions.StagingLocationFactory - Class in org.apache.beam.runners.dataflow.options
-
Returns a default staging location under GcpOptions.getGcpTempLocation().
- DataflowPipelineRegistrar - Class in org.apache.beam.runners.dataflow
- DataflowPipelineRegistrar.Options - Class in org.apache.beam.runners.dataflow
-
Register the DataflowPipelineOptions.
- DataflowPipelineRegistrar.Runner - Class in org.apache.beam.runners.dataflow
-
Register the DataflowRunner.
- DataflowPipelineTranslator - Class in org.apache.beam.runners.dataflow
-
DataflowPipelineTranslator knows how to translate Pipeline objects into Cloud Dataflow Service API Jobs.
- DataflowPipelineTranslator.JobSpecification - Class in org.apache.beam.runners.dataflow
-
The result of a job translation.
- DataflowPipelineWorkerPoolOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options that are used to configure the Dataflow pipeline worker pool.
- DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType - Enum Class in org.apache.beam.runners.dataflow.options
-
Type of autoscaling algorithm to use.
- DataflowProfilingAgentConfiguration() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowProfilingOptions.DataflowProfilingAgentConfiguration
- DataflowProfilingOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options for controlling profiling of pipeline execution.
- DataflowProfilingOptions.DataflowProfilingAgentConfiguration - Class in org.apache.beam.runners.dataflow.options
-
Configuration for the profiling agent.
- DataflowRunner - Class in org.apache.beam.runners.dataflow
-
A PipelineRunner that executes the operations in the pipeline by first translating them to the Dataflow representation using the DataflowPipelineTranslator and then submitting them to a Dataflow service for execution.
- DataflowRunner(DataflowPipelineOptions) - Constructor for class org.apache.beam.runners.dataflow.DataflowRunner
- DataflowRunner.DataflowTransformTranslator - Class in org.apache.beam.runners.dataflow
- DataflowRunner.StreamingPCollectionViewWriterFn<T> - Class in org.apache.beam.runners.dataflow
-
A marker DoFn for writing the contents of a PCollection to a streaming PCollectionView backend implementation.
- DataflowRunnerHooks - Class in org.apache.beam.runners.dataflow
-
An instance of this class can be passed to the DataflowRunner to add user-defined hooks to be invoked at various times during pipeline execution.
- DataflowRunnerHooks() - Constructor for class org.apache.beam.runners.dataflow.DataflowRunnerHooks
- DataflowRunnerInfo - Class in org.apache.beam.runners.dataflow
-
Populates versioning and other information for DataflowRunner.
- DataflowServiceException - Exception Class in org.apache.beam.runners.dataflow
-
Signals there was an error retrieving information about a job from the Cloud Dataflow Service.
- DataflowStreamingPipelineOptions - Interface in org.apache.beam.runners.dataflow.options
-
[Internal] Options for configuring StreamingDataflowWorker.
- DataflowStreamingPipelineOptions.EnableWindmillServiceDirectPathFactory - Class in org.apache.beam.runners.dataflow.options
-
EnableStreamingEngine defaults to false unless one of the experiments is set.
- DataflowStreamingPipelineOptions.GlobalConfigRefreshPeriodFactory - Class in org.apache.beam.runners.dataflow.options
-
Read global get config request period from system property 'windmill.global_config_refresh_period'.
- DataflowStreamingPipelineOptions.HarnessUpdateReportingPeriodFactory - Class in org.apache.beam.runners.dataflow.options
-
Read counter reporting period from system property 'windmill.harness_update_reporting_period'.
- DataflowStreamingPipelineOptions.LocalWindmillHostportFactory - Class in org.apache.beam.runners.dataflow.options
-
Factory for creating local Windmill address.
- DataflowStreamingPipelineOptions.MaxStackTraceDepthToReportFactory - Class in org.apache.beam.runners.dataflow.options
-
Read 'MaxStackTraceToReport' from system property 'windmill.max_stack_trace_to_report' or Integer.MAX_VALUE if unspecified.
- DataflowStreamingPipelineOptions.PeriodicStatusPageDirectoryFactory - Class in org.apache.beam.runners.dataflow.options
-
Read 'PeriodicStatusPageOutputDirector' from system property 'windmill.periodic_status_page_directory' or null if unspecified.
- DataflowStreamingPipelineOptions.WindmillServiceStreamingRpcBatchLimitFactory - Class in org.apache.beam.runners.dataflow.options
-
Factory for setting value of WindmillServiceStreamingRpcBatchLimit based on environment.
- DataflowTemplateJob - Class in org.apache.beam.runners.dataflow.util
-
A DataflowPipelineJob that is returned when --templateRunner is set.
- DataflowTemplateJob() - Constructor for class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
- DataflowTransformTranslator() - Constructor for class org.apache.beam.runners.dataflow.DataflowRunner.DataflowTransformTranslator
- DataflowTransport - Class in org.apache.beam.runners.dataflow.util
-
Helpers for cloud communication.
- DataflowTransport() - Constructor for class org.apache.beam.runners.dataflow.util.DataflowTransport
- DataflowWorkerHarnessOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options that are used exclusively within the Dataflow worker harness.
- DataflowWorkerLoggingOptions - Interface in org.apache.beam.runners.dataflow.options
-
Deprecated. This interface will no longer be the source of truth for worker logging configuration once jobs are executed using a dedicated SDK harness instead of user code being co-located alongside Dataflow worker code. Consider setting the corresponding options within SdkHarnessOptions to ensure forward compatibility.
- DataflowWorkerLoggingOptions.Level - Enum Class in org.apache.beam.runners.dataflow.options
-
Deprecated. The set of log levels that can be used on the Dataflow worker.
- DataflowWorkerLoggingOptions.WorkerLogLevelOverrides - Class in org.apache.beam.runners.dataflow.options
-
Deprecated. Defines a log level override for a specific class, package, or name.
- DataframeTransform - Class in org.apache.beam.sdk.extensions.python.transforms
-
Wrapper for invoking external Python DataframeTransform.
- DataGeneratorPTransform - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datagen
-
The main PTransform that encapsulates the data generation logic.
- DataGeneratorPTransform(Schema, ObjectNode) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorPTransform
- DataGeneratorRowFn - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datagen
-
A stateful DoFn that converts a sequence of Longs into structured Rows.
- DataGeneratorRowFn(Schema, ObjectNode, String) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorRowFn
- DataGeneratorTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datagen
-
Represents a 'datagen' table within a Beam SQL pipeline.
- DataGeneratorTable(Schema, ObjectNode) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTable
- DataGeneratorTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datagen
-
The service entry point for the 'datagen' table type.
- DataGeneratorTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTableProvider
- DataInputViewWrapper - Class in org.apache.beam.runners.flink.translation.wrappers
-
Wrapper for DataInputView.
- DataInputViewWrapper(DataInputView) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.DataInputViewWrapper
- DataOutputViewWrapper - Class in org.apache.beam.runners.flink.translation.wrappers
-
Wrapper for DataOutputView.
- DataOutputViewWrapper(DataOutputView) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.DataOutputViewWrapper
- dataSchema - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- dataset - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- Dataset - Interface in org.apache.beam.runners.spark.translation
-
Holder for Spark RDD/DStream.
- datasetExists(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- dataSets - Variable in class org.apache.beam.runners.twister2.Twister2TranslationContext
- DatasetServiceImpl(BigQueryOptions) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
- dataSourceConfiguration() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransform
- dataSourceConfiguration() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransform
- DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.influxdb.InfluxDbIO.DataSourceConfiguration
- DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
- DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
- DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- DatastoreIO - Class in org.apache.beam.sdk.io.gcp.datastore
-
DatastoreIO provides an API for reading from and writing to Google Cloud Datastore over different versions of the Cloud Datastore Client libraries.
- DatastoreV1 - Class in org.apache.beam.sdk.io.gcp.datastore
-
DatastoreV1 provides an API to Read, Write and Delete PCollections of Google Cloud Datastore version v1 Entity objects.
- DatastoreV1.DeleteEntity - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that deletes Entities from Cloud Datastore.
- DatastoreV1.DeleteEntityWithSummary - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that deletes Entities from Cloud Datastore and returns DatastoreV1.WriteSuccessSummary for each successful write.
- DatastoreV1.DeleteKey - Class in org.apache.beam.sdk.io.gcp.datastore
- DatastoreV1.DeleteKeyWithSummary - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that deletes Entities associated with the given Keys from Cloud Datastore and returns DatastoreV1.WriteSuccessSummary for each successful delete.
- DatastoreV1.Read - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that reads the result rows of a Cloud Datastore query as Entity objects.
- DatastoreV1.Write - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that writes Entity objects to Cloud Datastore.
- DatastoreV1.WriteSuccessSummary - Class in org.apache.beam.sdk.io.gcp.datastore
-
Summary object produced when a number of writes are successfully written to Datastore in a single Mutation.
- DatastoreV1.WriteWithSummary - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that writes Entity objects to Cloud Datastore and returns DatastoreV1.WriteSuccessSummary for each successful write.
- DataStoreV1SchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.datastore
-
An implementation of SchemaIOProvider for reading and writing payloads with DatastoreIO.
- DataStoreV1SchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
- DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO - Class in org.apache.beam.sdk.io.gcp.datastore
-
An abstraction to create schema-aware IOs.
- DataStoreV1TableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datastore
-
TableProvider for DatastoreIO for consumption by Beam SQL.
- DataStoreV1TableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
- DataStreamDecoder(Coder<T>, PrefetchableIterator<ByteString>) - Constructor for class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
- DataStreams - Class in org.apache.beam.sdk.fn.stream
-
DataStreams.DataStreamDecoder treats multiple ByteStrings as a single input stream decoding values with the supplied iterator.
- DataStreams() - Constructor for class org.apache.beam.sdk.fn.stream.DataStreams
- DataStreams.DataStreamDecoder<T> - Class in org.apache.beam.sdk.fn.stream
-
An adapter which converts an InputStream to a PrefetchableIterator of T values using the specified Coder.
- DataStreams.ElementDelimitedOutputStream - Class in org.apache.beam.sdk.fn.stream
-
An adapter which wraps a DataStreams.OutputChunkConsumer as an OutputStream.
- DataStreams.OutputChunkConsumer<T> - Interface in org.apache.beam.sdk.fn.stream
-
A callback which is invoked whenever the DataStreams.outbound(org.apache.beam.sdk.fn.stream.DataStreams.OutputChunkConsumer<org.apache.beam.vendor.grpc.v1p69p0.com.google.protobuf.ByteString>) OutputStream becomes full.
- Date - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A date without a time-zone.
- Date() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.Date
- DATE - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- DATE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- DATE - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- DATE - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SqlTypes
-
Beam LogicalType corresponding to the Calcite SQL DATE type.
- DATE_FIELD_NAME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- DATE_TYPES - Static variable in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- DateConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
- DateConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.DateConversion
- DateIncrementAllFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.DateIncrementAllFn
- DateTime - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A datetime without a time-zone.
- DateTime() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- DATETIME - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- DATETIME - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- DATETIME - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- DATETIME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SqlTypes
-
Beam LogicalType corresponding to DATETIME type.
- DATETIME - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of datetime fields.
- DATETIME_SCHEMA - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- DateTimeBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.DateTimeBundle
- days(int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
-
Returns a WindowFn that windows elements into periods measured by days.
- DB2 - Enum constant in enum class org.apache.beam.io.debezium.Connectors
- DDL_EXECUTOR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.parser.BeamSqlParser
-
DDL executor.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
-
The tag for the deadletter output of FHIR resources.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
-
The tag for the deadletter output of FHIR Resources.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
-
The tag for the deadletter output of FHIR Resources from a GetPatientEverything request.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read
-
The tag for the deadletter output of HL7v2 read responses.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
-
The tag for the deadletter output of HL7v2 Messages.
- DeadLetteredTransform<InputT, OutputT> - Class in org.apache.beam.sdk.schemas.io
- deadLetterQueue - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- Section
- DebeziumIO - Class in org.apache.beam.io.debezium
-
Utility class which exposes an implementation of DebeziumIO.read() and a Debezium configuration.
- DebeziumIO.ConnectorConfiguration - Class in org.apache.beam.io.debezium
-
A POJO describing a Debezium configuration.
- DebeziumIO.Read<T> - Class in org.apache.beam.io.debezium
-
Implementation of DebeziumIO.read().
- DebeziumReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- DebeziumReadSchemaTransformProvider - Class in org.apache.beam.io.debezium
-
A schema-aware transform provider for DebeziumIO.
- DebeziumReadSchemaTransformProvider() - Constructor for class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
- DebeziumReadSchemaTransformProvider(Boolean, Integer, Long) - Constructor for class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
- DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration - Class in org.apache.beam.io.debezium
- DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.io.debezium
- debeziumRecordInstant(SourceRecord) - Static method in class org.apache.beam.io.debezium.KafkaConnectUtils
- DebeziumSDFDatabaseHistory() - Constructor for class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
- DebeziumTransformRegistrar - Class in org.apache.beam.io.debezium
-
Exposes DebeziumIO.Read as an external transform for cross-language usage.
- DebeziumTransformRegistrar() - Constructor for class org.apache.beam.io.debezium.DebeziumTransformRegistrar
- DebeziumTransformRegistrar.ReadBuilder - Class in org.apache.beam.io.debezium
- DebeziumTransformRegistrar.ReadBuilder.Configuration - Class in org.apache.beam.io.debezium
- DEBUG - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
-
Deprecated. Level for logging diagnostic messages.
- DEBUG - Enum constant in enum class org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
-
LogLevel for logging diagnostic messages.
- dec() - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
- dec() - Method in interface org.apache.beam.sdk.metrics.Counter
- dec() - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
- dec() - Method in class org.apache.beam.sdk.metrics.NoOpCounter
- dec(long) - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
- dec(long) - Method in interface org.apache.beam.sdk.metrics.Counter
- dec(long) - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
- dec(long) - Method in class org.apache.beam.sdk.metrics.NoOpCounter
- decActivePartitionReadCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Decrements the ChangeStreamMetrics.ACTIVE_PARTITION_READ_COUNT by 1 if the metric is enabled.
- DECIMAL - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- DECIMAL - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- DECIMAL - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of decimal fields.
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator.SequenceRangeAccumulatorCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.ReadableFileCoder
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.FooterCoder
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShardCoder
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefixCoder
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.MetadataKeyCoder
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- decode(InputStream) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- decode(InputStream) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BitSetCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BooleanCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ByteCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.Coder
-
Decodes a value of type T from the given input stream in the given context.
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DelegateCoder
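The decode(InputStream) contract implemented by the coders above can be illustrated with a plain-Java sketch. The class name BigEndianIntSketch is hypothetical and this is not the Beam implementation of BigEndianIntegerCoder, only a self-contained analogue of what such an encode/decode pair does:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Hypothetical sketch of a coder's encode/decode pair, in the spirit of
// BigEndianIntegerCoder; not the actual Beam source.
public class BigEndianIntSketch {
  public static void encode(int value, OutputStream out) throws IOException {
    new DataOutputStream(out).writeInt(value); // writes 4 bytes, big-endian
  }

  public static int decode(InputStream in) throws IOException {
    return new DataInputStream(in).readInt(); // reads 4 bytes, big-endian
  }

  public static void main(String[] args) throws IOException {
    ByteArrayOutputStream buf = new ByteArrayOutputStream();
    encode(42, buf);
    int roundTripped = decode(new ByteArrayInputStream(buf.toByteArray()));
    System.out.println(roundTripped); // prints 42
  }
}
```

A real Coder must additionally guarantee that encode and decode are exact inverses for every value of the coded type.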
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DoubleCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DurationCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.FloatCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.InstantCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.KvCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.MapCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.NullableCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.OptionalCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.SerializableCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.SnappyCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VarIntCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VarLongCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VoidCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ZstdCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.AttributeValueCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.CountingSource.CounterMarkCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocumentCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoderV2
- decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation.RowMutationCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponseCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.thrift.ThriftCoder
-
Decodes a value of type T from the given input stream using the provided ThriftCoder.protocolFactory.
- decode(InputStream) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.schemas.SchemaCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.WindowedValues.ValueOnlyWindowedValueCoder
-
Deprecated.
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BitSetCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.Coder
-
Deprecated. Only implement and call Coder.decode(InputStream).
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.DelegateCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.KvCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.MapCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.NullableCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.WindowedValues.ValueOnlyWindowedValueCoder
-
Deprecated.
- decodeFromChunkBoundaryToChunkBoundary() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
-
Skips any remaining bytes in the current ByteString, moving to the next ByteString in the underlying ByteString iterator and decoding elements until it is at the next boundary.
- decodeKey(ByteBuffer, Coder<K>) - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.FlinkKeyUtils
-
Decodes a key from a ByteBuffer containing a byte array.
- decodePacked32TimeSeconds(int) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeSeconds as a LocalTime with seconds precision.
- decodePacked32TimeSecondsAsJavaTime(int) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeSeconds as a LocalTime with seconds precision.
- decodePacked64DatetimeMicros(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldDatetimeMicros as a LocalDateTime with microseconds precision.
- decodePacked64DatetimeMicrosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldDatetimeMicros as a LocalDateTime with microseconds precision.
- decodePacked64DatetimeSeconds(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldDatetimeSeconds as a LocalDateTime with seconds precision.
- decodePacked64DatetimeSecondsAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldDatetimeSeconds as a LocalDateTime with seconds precision.
- decodePacked64TimeMicros(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeMicros as a LocalTime with microseconds precision.
- decodePacked64TimeMicrosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeMicros as a LocalTime with microseconds precision.
- decodePacked64TimeNanos(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeNanos as a LocalTime with nanoseconds precision.
- decodePacked64TimeNanosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeNanos as a LocalTime with nanoseconds precision.
- decodePane(byte) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
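The CivilTimeEncoder.decodePacked32TimeSeconds family above works on bit-packed civil time values. As a hedged illustration of the seconds-precision layout (hour in 5 bits, minute in 6 bits, second in 6 bits, per the BigQuery/ZetaSQL civil time encoding), the following self-contained sketch re-derives the packing; PackedTimeSketch is a hypothetical name and this is not the Beam source:

```java
// Illustrative sketch of the packed "time seconds" bit layout used by
// methods like CivilTimeEncoder.decodePacked32TimeSeconds:
// | hour (5 bits) | minute (6 bits) | second (6 bits) |
// Plain-Java re-derivation for illustration only.
public class PackedTimeSketch {
  public static int encode(int hour, int minute, int second) {
    return (hour << 12) | (minute << 6) | second;
  }

  public static int hour(int packed)   { return (packed >> 12) & 0x1F; }
  public static int minute(int packed) { return (packed >> 6) & 0x3F; }
  public static int second(int packed) { return packed & 0x3F; }

  public static void main(String[] args) {
    int packed = encode(13, 45, 30);
    // Fields round-trip through the bit masks.
    System.out.println(hour(packed) + ":" + minute(packed) + ":" + second(packed)); // prints 13:45:30
  }
}
```

The micros and nanos variants extend the same idea by reserving additional low-order bits for the sub-second fraction.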
- decodeQueryResult(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- decodeTimerDataTimerId(String) - Static method in class org.apache.beam.runners.fnexecution.control.TimerReceiverFactory
-
Decodes a string into the transform and timer family ids.
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.CollectionCoder
-
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.DequeCoder
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.IterableCoder
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.ListCoder
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.SetCoder
-
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
- decodeToIterable(List<T>, long, InputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements with the InputStream at the position where this coder detected the end of the stream.
- decodeWindowedValue(byte[], Coder) - Static method in class org.apache.beam.runners.jet.Utils
- DecodingFnDataReceiver<T> - Class in org.apache.beam.sdk.fn.data
-
A receiver of encoded data, decoding it and passing it onto a downstream consumer.
- DecodingFnDataReceiver(Coder<T>, FnDataReceiver<T>) - Constructor for class org.apache.beam.sdk.fn.data.DecodingFnDataReceiver
- decPartitionStreamCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Decrements the ChangeStreamMetrics.PARTITION_STREAM_COUNT by 1.
- DECRBY - Enum constant in enum class org.apache.beam.sdk.io.redis.RedisIO.Write.Method
-
Use DECRBY command.
- decrementingLongs() - Static method in class org.apache.beam.sdk.fn.IdGenerators
-
Returns an IdGenerator that will provide successive decrementing longs.
- DedupingOperator<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io
-
Remove values with duplicate ids.
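The decrementingLongs() entry above can be sketched in a few lines of plain Java. The class name DecrementingIds and the starting point of 0 are assumptions for illustration; this is not Beam's IdGenerators implementation:

```java
import java.util.concurrent.atomic.AtomicLong;

// Minimal sketch of a "successive decrementing longs" generator in the
// spirit of IdGenerators.decrementingLongs(); illustration only.
public class DecrementingIds {
  private final AtomicLong value = new AtomicLong(0);

  public long getId() {
    return value.decrementAndGet(); // -1, -2, -3, ...
  }

  public static void main(String[] args) {
    DecrementingIds ids = new DecrementingIds();
    System.out.println(ids.getId()); // prints -1
    System.out.println(ids.getId()); // prints -2
  }
}
```

Using an AtomicLong keeps the generator safe to share across threads.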
- DedupingOperator(PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.DedupingOperator
- deduplicate() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- deduplicate(UuidDeduplicationOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
-
Removes duplicates from the output of a read.
- Deduplicate - Class in org.apache.beam.sdk.transforms
-
A set of PTransforms which deduplicate input records over a time domain and threshold.
- Deduplicate.KeyedValues<K, V> - Class in org.apache.beam.sdk.transforms
Deduplicates keyed values using the key over a specified time domain and threshold.
- Deduplicate.Values<T> - Class in org.apache.beam.sdk.transforms
-
Deduplicates values over a specified time domain and threshold.
- Deduplicate.WithRepresentativeValues<T, IdT> - Class in org.apache.beam.sdk.transforms
A PTransform that uses a SerializableFunction to obtain a representative value for each input element, used for deduplication.
- Deduplication - Search tag in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
- Section
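The Deduplicate.WithRepresentativeValues entry above describes deduplication keyed on a user-supplied representative value. A conceptual in-memory sketch (DedupSketch is hypothetical; the real transform keeps its seen-set in windowed state bounded by a time domain and threshold, which this omits):

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.function.Function;

// Conceptual sketch of representative-value deduplication: a function maps
// each element to an id, and only the first element per id is emitted.
// Illustration only; not Beam's stateful, windowed implementation.
public class DedupSketch {
  public static <T, IdT> List<T> dedup(List<T> input, Function<T, IdT> representativeFn) {
    Set<IdT> seen = new HashSet<>();
    List<T> out = new ArrayList<>();
    for (T element : input) {
      // Set.add returns false when the representative was already seen.
      if (seen.add(representativeFn.apply(element))) {
        out.add(element);
      }
    }
    return out;
  }

  public static void main(String[] args) {
    List<String> events = List.of("user1:click", "user2:click", "user1:scroll");
    // Deduplicate by the user id before the colon; "user1:scroll" is dropped.
    System.out.println(dedup(events, e -> e.split(":")[0]));
  }
}
```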
- deepEquals(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
- deepEquals(Object, Object, Schema.FieldType) - Static method in class org.apache.beam.sdk.values.Row.Equals
- deepHashCode(Object, Schema.FieldType) - Static method in class org.apache.beam.sdk.values.Row.Equals
- DEF - Static variable in interface org.apache.beam.sdk.extensions.sql.impl.planner.NodeStatsMetadata
- Default - Annotation Interface in org.apache.beam.sdk.options
-
Default represents a set of annotations that can be used to annotate getter properties on PipelineOptions with information representing the default value to be returned if no value is specified.
- Default() - Constructor for class org.apache.beam.sdk.fn.stream.PrefetchableIterables.Default
- DEFAULT - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.DefaultType
- DEFAULT - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
-
The default behavior if no method is explicitly set.
- DEFAULT - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
The default behavior if no method is explicitly set.
- DEFAULT - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
- DEFAULT - Static variable in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
- DEFAULT - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- DEFAULT_ADVANCE_TIMEOUT_IN_MILLIS - Static variable in class org.apache.beam.sdk.io.solace.broker.SolaceMessageReceiver
- DEFAULT_ATTRIBUTE - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
- DEFAULT_BUFFER_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
- DEFAULT_BUFFER_LIMIT_TIME_MS - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
- DEFAULT_BUFFER_SIZE - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- DEFAULT_BYTE_ARRAY_CODER - Static variable in class org.apache.beam.sdk.io.TFRecordIO
-
The default coder, which returns each record of the input file as a byte array.
- DEFAULT_CHANGE_STREAM_NAME - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default change stream name for a change stream query is the empty String.
- DEFAULT_CONTEXT - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- DEFAULT_DEDUPLICATE_DURATION - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- DEFAULT_DURATION - Static variable in class org.apache.beam.sdk.transforms.Deduplicate
-
The default duration is 10 minutes.
- DEFAULT_INCLUSIVE_END_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default end timestamp for a change stream query is ChangeStreamsConstants.MAX_INCLUSIVE_END_AT.
- DEFAULT_INCLUSIVE_START_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default start timestamp for a change stream query is Timestamp.MIN_VALUE.
- DEFAULT_INITIAL_BACKOFF - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
- DEFAULT_MASTER_URL - Static variable in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- DEFAULT_MAX_CUMULATIVE_BACKOFF - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
- DEFAULT_MAX_ELEMENTS_TO_OUTPUT - Static variable in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
- DEFAULT_MAX_INSERT_BLOCK_SIZE - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
- DEFAULT_MAX_INVOCATION_HISTORY - Static variable in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
The default maximum number of completed invocations to keep.
- DEFAULT_MAX_NUM_ELEMENTS - Static variable in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
The cost (in time and space) to compute quantiles to a given accuracy is a function of the total number of elements in the data set.
- DEFAULT_MAX_RETRIES - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
- DEFAULT_METADATA_TABLE_NAME - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- DEFAULT_OUTBOUND_BUFFER_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.fn.stream.DataStreams
- DEFAULT_PRECISION - Static variable in class org.apache.beam.sdk.extensions.zetasketch.HllCount
-
The default precision value used in HllCount.Init.Builder.withPrecision(int) is 15.
- DEFAULT_RPC_PRIORITY - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default priority for a change stream query is Options.RpcPriority.HIGH.
- DEFAULT_SCHEMA_FIELD_NAME - Static variable in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- DEFAULT_SCHEMA_RECORD_NAME - Static variable in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- DEFAULT_SCHEME - Static variable in class org.apache.beam.sdk.io.FileSystems
- DEFAULT_SESSION_DURATION_SECS - Static variable in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider
- DEFAULT_TIME_DOMAIN - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- DEFAULT_TIME_DOMAIN - Static variable in class org.apache.beam.sdk.transforms.Deduplicate
-
The default is the processing time domain.
- DEFAULT_TIMEOUT - Static variable in class org.apache.beam.io.requestresponse.RequestResponseIO
-
The default Duration to wait until completion of user code.
- DEFAULT_UNWINDOWED_SHARD_TEMPLATE - Static variable in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
The default sharding name template.
- DEFAULT_UPLOAD_BUFFER_SIZE_BYTES - Static variable in class org.apache.beam.sdk.io.aws2.options.S3Options.S3UploadBufferSizeBytesFactory
- DEFAULT_USES_RESHUFFLE - Static variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- DEFAULT_UUID_EXTRACTOR - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- DEFAULT_VPN_NAME - Static variable in class org.apache.beam.sdk.io.solace.broker.SessionService
- DEFAULT_WATERMARK_REFRESH_RATE - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default period at which we will re-compute the watermark of the DetectNewPartitionsDoFn stage.
- DEFAULT_WINDOWED_SHARD_TEMPLATE - Static variable in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
The default windowed sharding name template used when writing windowed files.
- DEFAULT_WRITER_CLIENTS_PER_WORKER - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
- DEFAULT_WRITER_DELIVERY_MODE - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
- DEFAULT_WRITER_NUM_SHARDS - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
- DEFAULT_WRITER_PUBLISH_LATENCY_METRICS - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
- DEFAULT_WRITER_SUBMISSION_MODE - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
- DEFAULT_WRITER_TYPE - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
- Default.Boolean - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified boolean primitive value.
- Default.Byte - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified byte primitive value.
- Default.Character - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified char primitive value.
- Default.Class - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified Class value.
- Default.Double - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified double primitive value.
- Default.Enum - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified enum.
- Default.Float - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified float primitive value.
- Default.InstanceFactory - Annotation Interface in org.apache.beam.sdk.options
-
Value must be of type DefaultValueFactory and have a default constructor.
- Default.Integer - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified int primitive value.
- Default.Long - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified long primitive value.
- Default.Short - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified short primitive value.
- Default.String - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified String value.
- DefaultAutoscaler - Class in org.apache.beam.sdk.io.jms
-
Default implementation of AutoScaler.
- DefaultAutoscaler() - Constructor for class org.apache.beam.sdk.io.jms.DefaultAutoscaler
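The Default.* annotation entries above carry a fallback value on an options getter that the framework reads reflectively when no value is supplied. A self-contained sketch of that mechanism (the DefaultInteger annotation, MyOptions interface, and defaultFor helper are hypothetical stand-ins, not Beam's PipelineOptions machinery):

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;

// Conceptual sketch of how a Default.Integer-style annotation can supply a
// fallback value for an options getter via reflection; illustration only.
public class DefaultAnnotationSketch {
  @Retention(RetentionPolicy.RUNTIME)
  @interface DefaultInteger {
    int value();
  }

  interface MyOptions {
    @DefaultInteger(5)
    int getNumShards();
  }

  static int defaultFor(Class<?> options, String getter) throws NoSuchMethodException {
    Method m = options.getMethod(getter);
    DefaultInteger d = m.getAnnotation(DefaultInteger.class);
    return d == null ? 0 : d.value(); // fall back to 0 when unannotated
  }

  public static void main(String[] args) throws NoSuchMethodException {
    System.out.println(defaultFor(MyOptions.class, "getNumShards")); // prints 5
  }
}
```

RetentionPolicy.RUNTIME is what makes the annotation visible to reflection at run time.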
- DefaultBlobstoreClientBuilderFactory - Class in org.apache.beam.sdk.io.azure.blobstore
-
Construct BlobServiceClientBuilder with given values of Azure client properties.
- DefaultBlobstoreClientBuilderFactory() - Constructor for class org.apache.beam.sdk.io.azure.blobstore.DefaultBlobstoreClientBuilderFactory
- DefaultCoder - Annotation Interface in org.apache.beam.sdk.coders
-
The DefaultCoder annotation specifies a Coder class to handle encoding and decoding instances of the annotated class.
- DefaultCoder.DefaultCoderProviderRegistrar - Class in org.apache.beam.sdk.coders
-
A CoderProviderRegistrar that registers a CoderProvider which can use the @DefaultCoder annotation to provide coder providers that create Coders.
- DefaultCoder.DefaultCoderProviderRegistrar.DefaultCoderProvider - Class in org.apache.beam.sdk.coders
-
A CoderProvider that uses the @DefaultCoder annotation to provide coder providers that create Coders.
- DefaultCoderCloudObjectTranslatorRegistrar - Class in org.apache.beam.runners.dataflow.util
-
The CoderCloudObjectTranslatorRegistrar containing the default collection of Coder Cloud Object Translators.
- DefaultCoderCloudObjectTranslatorRegistrar() - Constructor for class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
- DefaultCoderProvider() - Constructor for class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar.DefaultCoderProvider
- DefaultCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar
- DefaultConcludeTransform() - Constructor for class org.apache.beam.sdk.testing.PAssert.DefaultConcludeTransform
- defaultConfig(JdbcConnection, Collection<RuleSet>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
- DefaultErrorHandler() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.DefaultErrorHandler
- DefaultExecutableStageContext - Class in org.apache.beam.runners.fnexecution.control
-
Implementation of an ExecutableStageContext.
- defaultFactory() - Static method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
The default ClientBuilderFactory instance.
- DefaultFilenamePolicy - Class in org.apache.beam.sdk.io
-
A default FileBasedSink.FilenamePolicy for windowed and unwindowed files.
- DefaultFilenamePolicy.Params - Class in org.apache.beam.sdk.io
-
Encapsulates constructor parameters to DefaultFilenamePolicy.
- DefaultFilenamePolicy.ParamsCoder - Class in org.apache.beam.sdk.io
-
A Coder for DefaultFilenamePolicy.Params.
- DefaultGcpRegionFactory - Class in org.apache.beam.runners.dataflow.options
-
Factory for a default value for Google Cloud region according to https://cloud.google.com/compute/docs/gcloud-compute/#default-properties.
- DefaultGcpRegionFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DefaultGcpRegionFactory
- DefaultGoogleAdsClientFactory - Class in org.apache.beam.sdk.io.googleads
-
The default way to construct a GoogleAdsClient.
- DefaultGoogleAdsClientFactory() - Constructor for class org.apache.beam.sdk.io.googleads.DefaultGoogleAdsClientFactory
- DefaultJobBundleFactory - Class in org.apache.beam.runners.fnexecution.control
-
A JobBundleFactory for which the implementation can specify a custom EnvironmentFactory for environment management.
- DefaultJobBundleFactory.ServerInfo - Class in org.apache.beam.runners.fnexecution.control
-
A container for EnvironmentFactory and its corresponding Grpc servers.
- DefaultJobBundleFactory.WrappedSdkHarnessClient - Class in org.apache.beam.runners.fnexecution.control
-
Holder for an
SdkHarnessClient along with its associated state and data servers. - DefaultJobServerConfigFactory() - Constructor for class org.apache.beam.runners.portability.testing.TestPortablePipelineOptions.DefaultJobServerConfigFactory
- DefaultMaxCacheMemoryUsageMb() - Constructor for class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMb
- DefaultMaxCacheMemoryUsageMbFactory() - Constructor for class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMbFactory
- defaultNaming(String, String) - Static method in class org.apache.beam.sdk.io.FileIO.Write
- defaultNaming(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.FileIO.Write
-
Defines a default
FileIO.Write.FileNaming which will use the prefix and suffix supplied to create a name based on the window, pane, number of shards, shard index, and compression. - defaultOptions() - Static method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Factory method to return a new instance of
RpcQosOptions with all default values. - DefaultPipelineOptionsRegistrar - Class in org.apache.beam.sdk.options
-
A
PipelineOptionsRegistrar containing the PipelineOptions subclasses available by default. - DefaultPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.options.DefaultPipelineOptionsRegistrar
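The windowed-naming idea behind FileIO.Write.defaultNaming above can be sketched in plain Java. This helper and its format string are illustrative only; Beam's real naming also folds in the window, pane, and compression.

```java
// Illustrative sketch only: builds a sharded file name from a prefix and
// suffix, in the spirit of FileIO.Write.defaultNaming.
class ShardNames {
    static String name(String prefix, String suffix, int shardIndex, int numShards) {
        // Zero-pad shard index and shard count so names sort lexicographically.
        return String.format("%s-%05d-of-%05d%s", prefix, shardIndex, numShards, suffix);
    }
}
```

For example, name("out", ".txt", 0, 2) yields out-00000-of-00002.txt.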
- DefaultProjectFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.DefaultProjectFactory
- DefaultRateLimiter(BackOff, BackOff) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
- DefaultRateLimiter(Duration, Duration, Duration) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
- DefaultRetryStrategy() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.DefaultRetryStrategy
- defaults() - Static method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- Defaults() - Constructor for class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
- DefaultS3ClientBuilderFactory - Class in org.apache.beam.sdk.io.aws2.s3
-
Construct S3ClientBuilder with default values of S3 client properties like path style access, accelerated mode, etc.
- DefaultS3ClientBuilderFactory() - Constructor for class org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
- DefaultS3FileSystemSchemeRegistrar - Class in org.apache.beam.sdk.io.aws2.s3
-
Registers the "s3" uri schema to be handled by
S3FileSystem. - DefaultS3FileSystemSchemeRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.s3.DefaultS3FileSystemSchemeRegistrar
- DefaultSchema - Annotation Interface in org.apache.beam.sdk.schemas.annotations
-
The
DefaultSchema annotation specifies a SchemaProvider class to handle obtaining a schema and row for the specified class. - DefaultSchema.DefaultSchemaProvider - Class in org.apache.beam.sdk.schemas.annotations
-
SchemaProvider for default schemas. - DefaultSchema.DefaultSchemaProviderRegistrar - Class in org.apache.beam.sdk.schemas.annotations
-
Registrar for default schemas.
- DefaultSchemaProvider() - Constructor for class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
- DefaultSchemaProviderRegistrar() - Constructor for class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProviderRegistrar
- DefaultSequenceCombiner<EventKeyT, EventT, StateT> - Class in org.apache.beam.sdk.extensions.ordered.combiner
-
Default global sequence combiner.
- DefaultSequenceCombiner(EventExaminer<EventT, StateT>) - Constructor for class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- DefaultStopPipelineWatermarkFactory() - Constructor for class org.apache.beam.runners.spark.TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory
- DefaultTableFilter - Class in org.apache.beam.sdk.extensions.sql.meta
-
The default implementation of the
BeamSqlTableFilter interface. - DefaultTableFilter(List<RexNode>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.DefaultTableFilter
- DefaultTrigger - Class in org.apache.beam.sdk.transforms.windowing
-
A trigger that is equivalent to
Repeatedly.forever(AfterWatermark.pastEndOfWindow()). - defaultType() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
- DefaultTypeConversionsFactory() - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
- defaultValue() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- defaultValue() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
- defaultValue() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns the default value when there are no values added to the accumulator.
- defaultValue() - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
-
Returns the default value when there are no values added to the accumulator.
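The defaultValue() contract can be illustrated with a minimal plain-Java sketch (the names here are hypothetical, not Beam's API): when a combine sees no inputs, the default is returned instead of extracting a result from an empty accumulator.

```java
import java.util.List;

// Hypothetical sketch of the defaultValue() contract from
// Combine.CombineFn / CombineFnBase.GlobalCombineFn.
interface CombineSketch<T> {
    T combine(List<T> inputs);   // stands in for the full add/merge/extract cycle
    T defaultValue();            // used when no values were added

    default T applyTo(List<T> inputs) {
        return inputs.isEmpty() ? defaultValue() : combine(inputs);
    }
}
```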
- defaultValue() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
- defaultValue() - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
-
Returns the default value of this transform, or null if there isn't one.
- DefaultValueFactory<T> - Interface in org.apache.beam.sdk.options
-
An interface used with the
Default.InstanceFactory annotation to specify the class that will be an instance factory to produce default values for a given getter on PipelineOptions. - Defining Your Own PipelineOptions - Search tag in interface org.apache.beam.sdk.options.PipelineOptions
- Section
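The DefaultValueFactory pattern (compute a default lazily from the options object only when the user has not set a value) can be sketched without Beam as follows; all names here are illustrative:

```java
// Illustrative sketch of the DefaultValueFactory idea: the factory runs
// only when no explicit value was configured, and may consult the options.
interface ValueFactory<O, T> {
    T create(O options);
}

class OptionsSketch {
    Integer parallelism; // null means "not set by the user"

    int parallelism(ValueFactory<OptionsSketch, Integer> factory) {
        return parallelism != null ? parallelism : factory.create(this);
    }
}
```

DirectOptions.AvailableParallelismFactory follows the same shape, supplying Runtime.availableProcessors() as the default.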
- DEFLATE - Enum constant in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- DEFLATE - Enum constant in enum class org.apache.beam.sdk.io.Compression
-
Deflate compression.
- DEFLATE - Enum constant in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- DEFLATE - Enum constant in enum class org.apache.beam.sdk.io.TextIO.CompressionType
-
Deprecated.
- DEFLATE - Enum constant in enum class org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
-
Deprecated.
- deidentify(String, String, DeidentifyConfig) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Deidentify FHIR resources.
- deidentify(ValueProvider<String>, ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Deidentify FHIR resources.
- deidentify(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
- Deidentify(ValueProvider<String>, ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify
- deidentifyFhirStore(String, String, DeidentifyConfig) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Deidentify a GCP FHIR Store and write the result into a new FHIR Store.
- deidentifyFhirStore(String, String, DeidentifyConfig) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- DeidentifyFn(ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
- delay(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
-
For internal use only; no backwards-compatibility guarantees.
- Delay() - Constructor for class org.apache.beam.sdk.transforms.windowing.TimestampTransform.Delay
- DelayIntervalRateLimiter() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
- DelayIntervalRateLimiter(Supplier<Duration>) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
- delegate() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
- delegate(HasDisplayData) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register display data from the specified component on behalf of the current component.
- delegateBasedUponType(EnumMap<BeamFnApi.StateKey.TypeCase, StateRequestHandler>) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
-
Returns a
StateRequestHandler which delegates to the supplied handler depending on the BeamFnApi.StateRequest's type. - DelegateCoder<T, IntermediateT> - Class in org.apache.beam.sdk.coders
A
DelegateCoder<T, IntermediateT> wraps a Coder for IntermediateT and encodes/decodes values of type T by converting to/from IntermediateT and then encoding/decoding using the underlying Coder<IntermediateT>. - DelegateCoder(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.DelegateCoder
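The delegation pattern described above can be sketched in plain Java (no Beam dependency; all names here are illustrative): values of T are converted to an intermediate type that an existing codec already handles.

```java
import java.nio.charset.StandardCharsets;
import java.util.function.Function;

// Illustrative sketch of the DelegateCoder pattern: encode T by first
// converting it to an intermediate type I, then delegating to an
// existing codec for I. Decoding runs the two steps in reverse.
class DelegatingCodec<T, I> {
    private final Function<I, byte[]> encodeIntermediate;
    private final Function<byte[], I> decodeIntermediate;
    private final Function<T, I> toIntermediate;
    private final Function<I, T> fromIntermediate;

    DelegatingCodec(Function<I, byte[]> enc, Function<byte[], I> dec,
                    Function<T, I> to, Function<I, T> from) {
        this.encodeIntermediate = enc;
        this.decodeIntermediate = dec;
        this.toIntermediate = to;
        this.fromIntermediate = from;
    }

    byte[] encode(T value) { return encodeIntermediate.apply(toIntermediate.apply(value)); }
    T decode(byte[] bytes) { return fromIntermediate.apply(decodeIntermediate.apply(bytes)); }
}
```

For instance, an Integer can be coded through a String intermediate by passing Integer.toString and Integer::valueOf as the conversion functions.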
- DelegateCoder.CodingFunction<InputT, OutputT> - Interface in org.apache.beam.sdk.coders
A
CodingFunction<InputT, OutputT> is a serializable function from InputT to OutputT that may throw any Exception. - DelegatingCounter - Class in org.apache.beam.sdk.metrics
-
Implementation of
Counter that delegates to the instance for the current context. - DelegatingCounter(MetricName) - Constructor for class org.apache.beam.sdk.metrics.DelegatingCounter
-
Create a
DelegatingCounter with perWorkerCounter and processWideContainer set to false. - DelegatingCounter(MetricName, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingCounter
-
Create a
DelegatingCounter with perWorkerCounter set to false. - DelegatingCounter(MetricName, boolean, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingCounter
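The "delegates to the instance for the current context" behavior can be sketched with a thread-local container (illustrative only; Beam's real container lookup is more involved):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of a delegating counter: the counter is a cheap,
// shareable handle; each inc() resolves the current thread's container
// at call time and records the increment there.
class DelegatingCounterSketch {
    static final ThreadLocal<Map<String, Long>> CONTAINER =
        ThreadLocal.withInitial(HashMap::new);

    private final String name;

    DelegatingCounterSketch(String name) { this.name = name; }

    void inc() { CONTAINER.get().merge(name, 1L, Long::sum); }
}
```

The same indirection underlies DelegatingDistribution, DelegatingGauge, and DelegatingHistogram below: the metric object stays constant while the destination varies per context.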
- DelegatingDistribution - Class in org.apache.beam.sdk.metrics
-
Implementation of
Distribution that delegates to the instance for the current context. - DelegatingDistribution(MetricName) - Constructor for class org.apache.beam.sdk.metrics.DelegatingDistribution
- DelegatingDistribution(MetricName, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingDistribution
- DelegatingGauge - Class in org.apache.beam.sdk.metrics
-
Implementation of
Gauge that delegates to the instance for the current context. - DelegatingGauge(MetricName) - Constructor for class org.apache.beam.sdk.metrics.DelegatingGauge
-
Create a
DelegatingGauge with perWorkerGauge and processWideContainer set to false. - DelegatingGauge(MetricName, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingGauge
- DelegatingHistogram - Class in org.apache.beam.sdk.metrics
-
Implementation of
Histogram that delegates to the instance for the current context. - DelegatingHistogram(MetricName, HistogramData.BucketType, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingHistogram
- delete() - Static method in class org.apache.beam.sdk.io.cassandra.CassandraIO
-
Provide a
CassandraIO.Write PTransform to delete data from a Cassandra database. - delete(Collection<ClassLoaderFileSystem.ClassLoaderResourceId>) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
- delete(Collection<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Deletes a collection of resources.
- delete(Collection<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
-
Deletes a collection of resources.
- DELETE - Enum constant in enum class org.apache.beam.sdk.io.cassandra.CassandraIO.MutationType
- DELETE - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation.MutationType
- DELETE - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
- DELETE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- deleteAsync(T) - Method in interface org.apache.beam.sdk.io.cassandra.Mapper
-
This method is called for each delete event.
- DeleteBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.DeleteBuilder
- deleteDataset(String, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Deletes the dataset specified by the datasetId value.
- deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
-
Deletes the dataset specified by the datasetId value.
- deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- deleteDicomStore(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Delete a Dicom Store.
- deleteDicomStore(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- deleteEntity() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
-
Returns an empty
DatastoreV1.DeleteEntity builder. - deleteFhirStore(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Delete Fhir store.
- deleteFhirStore(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- deleteFile() - Method in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
- deleteHL7v2Message(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Deletes an HL7v2 message.
- deleteHL7v2Message(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- deleteHL7v2Store(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Deletes an HL7v2 store.
- deleteHL7v2Store(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- deleteKey() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
-
Returns an empty
DatastoreV1.DeleteKey builder. - deleteNewPartition(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
This is the second step of a two-phase delete.
- deletePartitionMetadataTable(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Drops the metadata table.
- deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Delete
PubsubClient.SchemaPath. - deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Delete
PubsubClient.SchemaPath. - deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
Delete
PubsubClient.SchemaPath. - deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
Delete
PubsubClient.SchemaPath. - deleteStaticCaches() - Static method in class org.apache.beam.runners.flink.translation.utils.Workarounds
- deleteStreamPartitionRow(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
This is the second step of the two-phase delete of StreamPartition.
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Delete
subscription. - deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- deleteTable(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Deletes the table specified by tableId from the dataset.
- deleteTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
-
Deletes the table specified by tableId from the dataset.
- deleteTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- deleteTable(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- deleteTimer(StateNamespace, String, String) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- deleteTimer(StateNamespace, String, String, TimeDomain) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- deleteTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- deleteTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.translation.streaming.ParDoStateUpdateFn.SparkTimerInternalsIterator
- deleteTimer(Instant, TimeDomain) - Method in interface org.apache.beam.sdk.state.Timers
-
Removes the timer set in this context for the
timestamp and timeDomain. - deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- delimitElement() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
- dependencies(Row, PipelineOptions) - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
List the dependencies needed for this transform.
- dependencies(Row, PipelineOptions) - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
- Dependencies - Search tag in class org.apache.beam.io.debezium.DebeziumIO
- Section
- Dependency - Class in org.apache.beam.sdk.expansion.service
- Dependency() - Constructor for class org.apache.beam.sdk.expansion.service.Dependency
- dependsOnlyOnEarliestTimestamp() - Method in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Returns
true if the result of combination of many output timestamps actually depends only on the earliest. - dependsOnlyOnWindow() - Method in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Returns
true if the result does not depend on what outputs were combined but only the window they are in. - DequeCoder<T> - Class in org.apache.beam.sdk.coders
- DequeCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.DequeCoder
- deregister() - Method in interface org.apache.beam.runners.fnexecution.state.StateDelegator.Registration
-
De-registers the handler for all future requests for state for the registered process bundle instruction id.
- deriveIterableValueCoder(WindowedValues.FullWindowedValueCoder) - Static method in class org.apache.beam.runners.jet.Utils
- deriveRowType() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
- describe(Set<Class<? extends PipelineOptions>>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Outputs the set of options available to be set for the passed in
PipelineOptions interfaces. - describeMismatchSafely(BigqueryMatcher.TableAndQuery, Description) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- describeMismatchSafely(ShardedFile, Description) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
- describeMismatchSafely(T, Description) - Method in class org.apache.beam.sdk.testing.JsonMatcher
- describePipelineOptions(JobApi.DescribePipelineOptionsRequest, StreamObserver<JobApi.DescribePipelineOptionsResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- describeTo(Description) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.JsonMatcher
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.RegexMatcher
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
- description() - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromMySqlSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromOracleSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromPostgresSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromSqlServerSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToMySqlSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToOracleSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToPostgresSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToSqlServerSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
- description() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
Returns a description regarding the
SchemaTransform represented by the SchemaTransformProvider. - Description - Annotation Interface in org.apache.beam.sdk.options
-
Descriptions are used to generate human readable output when the
--help command is specified. - deserialize(byte[]) - Method in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializer
- deserialize(byte[], DataInputView) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.beam.sdk.options.ValueProvider.Deserializer
- deserialize(String, byte[]) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
- deserialize(StateNamespace, DataInputView) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- deserialize(DataInputView) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- deserialize(DataInputView) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- deserialize(DataInputView) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- deserialize(T, DataInputView) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- deserializeAwsCredentialsProvider(String) - Static method in class org.apache.beam.sdk.io.aws2.options.AwsSerializableUtils
- DeserializeBytesIntoPubsubMessagePayloadOnly() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly
- deserializeObject(byte[]) - Static method in class org.apache.beam.runners.flink.translation.utils.SerdeUtils
- deserializeOneOf(Expression, List<Encoder<T>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
- DeserializerProvider<T> - Interface in org.apache.beam.sdk.io.kafka
-
Provides a configured
Deserializer instance and its associated Coder. - deserializeTimers(Collection<byte[]>, TimerInternals.TimerDataCoderV2) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- desiredBundleSizeBytes - Variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- desiredRequestParallelism - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytes
- desiredRequestParallelism - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytesWithContext
- desiredRequestParallelism - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUri
- desiredRequestParallelism - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUriWithContext
- Destination() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.Destination
- DESTINATION - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- Detailed description - Search tag in org.apache.beam.sdk.io.BoundedSource.BoundedReader.splitAtFraction(double)
- Section
- detect(String) - Static method in enum class org.apache.beam.sdk.io.Compression
- DETECT_NEW_PARTITION_SUFFIX - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- detectNewPartitionsAction(ChangeStreamMetrics, MetadataTableDao, Instant, GenerateInitialPartitionsAction, ResumeFromPreviousPipelineAction, ProcessNewPartitionsAction) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class for processing
DetectNewPartitionsDoFn. - detectNewPartitionsAction(PartitionMetadataDao, PartitionMetadataMapper, WatermarkCache, ChangeStreamMetrics, Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a single instance of an action class capable of detecting and scheduling new partitions to be queried.
- DetectNewPartitionsAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
-
This class processes
DetectNewPartitionsDoFn. - DetectNewPartitionsAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is responsible for scheduling partitions.
- DetectNewPartitionsAction(ChangeStreamMetrics, MetadataTableDao, Instant, GenerateInitialPartitionsAction, ResumeFromPreviousPipelineAction, ProcessNewPartitionsAction) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.DetectNewPartitionsAction
- DetectNewPartitionsAction(PartitionMetadataDao, PartitionMetadataMapper, WatermarkCache, ChangeStreamMetrics, Duration) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DetectNewPartitionsAction
-
Constructs an action class for detecting / scheduling new partitions.
- DetectNewPartitionsDoFn - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
- DetectNewPartitionsDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
A SplittableDoFn (SDF) that is responsible for scheduling partitions to be queried.
- DetectNewPartitionsDoFn(DaoFactory, MapperFactory, ActionFactory, CacheFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
-
This class needs a
DaoFactory to build DAOs to access the partition metadata tables. - DetectNewPartitionsDoFn(Instant, ActionFactory, DaoFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
- DetectNewPartitionsRangeTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
This restriction tracker delegates most of its behavior to an internal
TimestampRangeTracker. - DetectNewPartitionsRangeTracker(TimestampRange) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.DetectNewPartitionsRangeTracker
- DetectNewPartitionsState - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
-
Metadata of the progress of
DetectNewPartitionsDoFn from the metadata table. - DetectNewPartitionsState(Instant, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
- DetectNewPartitionsTracker - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction
- DetectNewPartitionsTracker(long) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.DetectNewPartitionsTracker
- detectStreamingMode(Pipeline, StreamingOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
-
Analyse the pipeline to determine if we have to switch to streaming mode for the pipeline translation and update
StreamingOptions accordingly. - DicomIO - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The DicomIO connector allows Beam pipelines to make calls to the Dicom API of the Google Cloud Healthcare API (https://cloud.google.com/healthcare/docs/how-tos#dicom-guide).
- DicomIO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.DicomIO
- DicomIO.ReadStudyMetadata - Class in org.apache.beam.sdk.io.gcp.healthcare
-
This class makes a call to the retrieve metadata endpoint (https://cloud.google.com/healthcare/docs/how-tos/dicomweb#retrieving_metadata).
- DicomIO.ReadStudyMetadata.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
- dicomStorePath - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- DicomWebPath() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- DIRECT_READ - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
-
Read the contents of a table directly using the BigQuery storage API.
- Direct and persistent messages, and latency metrics - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- DirectOptions - Interface in org.apache.beam.runners.direct
-
Options that can be used to configure the
DirectRunner. - DirectOptions.AvailableParallelismFactory - Class in org.apache.beam.runners.direct
-
A
DefaultValueFactory that returns the result of Runtime.availableProcessors() from the DirectOptions.AvailableParallelismFactory.create(PipelineOptions) method. - DIRECTORY_CONTAINER - Static variable in class org.apache.beam.sdk.io.ShardNameTemplate
-
Shard is a file within a directory.
- DirectRegistrar - Class in org.apache.beam.runners.direct
- DirectRegistrar.Options - Class in org.apache.beam.runners.direct
-
Registers the
DirectOptions. - DirectRegistrar.Runner - Class in org.apache.beam.runners.direct
-
Registers the
DirectRunner. - DirectRunner - Class in org.apache.beam.runners.direct
- DirectRunner() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.DirectRunner
- DirectRunner.DirectPipelineResult - Class in org.apache.beam.runners.direct
-
The result of running a
Pipelinewith theDirectRunner. - DirectStreamObserver<T> - Class in org.apache.beam.sdk.fn.stream
-
A
StreamObserverwhich uses synchronization on the underlyingCallStreamObserverto provide thread safety. - DirectStreamObserver(Phaser, CallStreamObserver<T>) - Constructor for class org.apache.beam.sdk.fn.stream.DirectStreamObserver
- DirectTestOptions - Interface in org.apache.beam.runners.direct
-
Internal-only options for tweaking the behavior of the
PipelineOptions.DirectRunnerin ways that users should never do. - DISALLOW - Enum constant in enum class org.apache.beam.sdk.io.fs.EmptyMatchTreatment
-
Filepatterns matching no resources are disallowed.
- DISALLOW_COMBINER_LIFTING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- DISALLOWED_CONSUMER_PROPERTIES - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIOUtils
- discard() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
- discardDataset(Dataset) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- DISCARDING_FIRED_PANES - Enum constant in enum class org.apache.beam.sdk.values.WindowingStrategy.AccumulationMode
- discardingFiredPanes() - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Returns a new
WindowPTransformthat uses the registered WindowFn and Triggering behavior, and that discards elements in a pane after they are triggered. - discoverSchemaTransform(ExpansionApi.DiscoverSchemaTransformRequest, StreamObserver<ExpansionApi.DiscoverSchemaTransformResponse>) - Method in class org.apache.beam.sdk.expansion.service.ExpansionService
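The distinction `discardingFiredPanes()` draws can be shown with a small stand-alone sketch. This is conceptual plain Java, not Beam code: after a pane fires, discarding mode drops the buffered elements, while accumulating mode keeps them so later firings re-emit them together with new input.

```java
import java.util.ArrayList;
import java.util.List;

// Conceptual sketch (not Beam code) of discarding vs. accumulating
// fired panes: discarding mode clears the buffer after each firing,
// accumulating mode retains it for the next firing.
public class PaneModeSketch {
    private final boolean discarding;
    private final List<Integer> buffer = new ArrayList<>();

    public PaneModeSketch(boolean discarding) { this.discarding = discarding; }

    public void add(int element) { buffer.add(element); }

    // Simulates a trigger firing: returns the pane contents and, in
    // discarding mode, drops them from the buffer.
    public List<Integer> fire() {
        List<Integer> pane = new ArrayList<>(buffer);
        if (discarding) buffer.clear();
        return pane;
    }
}
```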
- dispatchBag(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
- dispatchBag(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- dispatchCombining(Combine.CombineFn<?, ?, ?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
- dispatchCombining(Combine.CombineFn<?, ?, ?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- dispatchDefault() - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- dispatchMap(Coder<?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
- dispatchMap(Coder<?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- dispatchMultimap(Coder<?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
- dispatchMultimap(Coder<?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- dispatchOrderedList(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
- dispatchSet(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
- dispatchSet(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- dispatchValue(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
- dispatchValue(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- DISPLAY_DATA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- displayData - Variable in class org.apache.beam.sdk.transforms.PTransform
- DisplayData - Class in org.apache.beam.sdk.transforms.display
-
Static display data associated with a pipeline component.
- DisplayData.Builder - Interface in org.apache.beam.sdk.transforms.display
-
Utility to build up display data from a component and its included subcomponents.
- DisplayData.Identifier - Class in org.apache.beam.sdk.transforms.display
-
Unique identifier for a display data item within a component.
- DisplayData.Item - Class in org.apache.beam.sdk.transforms.display
-
Itemsare the unit of display data. - DisplayData.ItemSpec<T> - Class in org.apache.beam.sdk.transforms.display
-
Specifies an
DisplayData.Itemto register as display data. - DisplayData.Path - Class in org.apache.beam.sdk.transforms.display
-
Structured path of registered display data within a component hierarchy.
- DisplayData.Type - Enum Class in org.apache.beam.sdk.transforms.display
-
Display data type.
- Distinct<T> - Class in org.apache.beam.sdk.transforms
-
Distinct<T>takes aPCollection<T>and returns aPCollection<T>that has all distinct elements of the input. - Distinct() - Constructor for class org.apache.beam.sdk.transforms.Distinct
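The contract of `Distinct` can be pictured with a plain-Java analogue. This is an illustration of the dedup semantics only: Beam's `Distinct` is a `PTransform` over a distributed, generally unordered `PCollection`, whereas the sketch below happens to preserve first-occurrence order in a local list.

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;

// Plain-Java illustration of the Distinct semantics: keep one copy of
// each element. Not Beam code; order preservation here is incidental.
public class DistinctSketch {
    public static <T> List<T> distinct(List<T> input) {
        return new ArrayList<>(new LinkedHashSet<>(input));
    }

    public static void main(String[] args) {
        System.out.println(distinct(List.of("a", "b", "a", "c", "b")));
    }
}
```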
- Distinct.WithRepresentativeValues<T,
IdT> - Class in org.apache.beam.sdk.transforms -
A
DistinctPTransformthat uses aSerializableFunctionto obtain a representative value for each input element. - distribution(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that records various statistics about the distribution of reported values.
- distribution(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that records various statistics about the distribution of reported values.
- Distribution - Interface in org.apache.beam.sdk.metrics
-
A metric that reports information about the distribution of reported values.
- DistributionImpl - Class in org.apache.beam.runners.jet.metrics
-
Implementation of
Distribution. - DistributionImpl(MetricName) - Constructor for class org.apache.beam.runners.jet.metrics.DistributionImpl
- DistributionResult - Class in org.apache.beam.sdk.metrics
-
The result of a
Distributionmetric. - DistributionResult() - Constructor for class org.apache.beam.sdk.metrics.DistributionResult
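What a `Distribution` metric tracks, and what a `DistributionResult` reports, boils down to sum, count, min and max of the reported values. The class below is a stand-alone sketch of that bookkeeping, not Beam's implementation; in Beam the metric is updated via `Metrics.distribution(...)` and read back from the pipeline result.

```java
// Stand-alone sketch of the statistics behind a Distribution metric
// (sum, count, min, max). Not Beam's implementation.
public class DistributionSketch {
    private long sum, count;
    private long min = Long.MAX_VALUE, max = Long.MIN_VALUE;

    public void update(long value) {
        sum += value;
        count++;
        min = Math.min(min, value);
        max = Math.max(max, value);
    }

    public long sum() { return sum; }
    public long count() { return count; }
    public long min() { return min; }
    public long max() { return max; }
    public double mean() { return count == 0 ? 0 : (double) sum / count; }
}
```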
- DIVIDE - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- divideBy(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- DLP_PAYLOAD_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- DLP_PAYLOAD_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- DLP_PAYLOAD_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- DLPDeidentifyText - Class in org.apache.beam.sdk.extensions.ml
-
A
PTransformconnecting to Cloud DLP (https://cloud.google.com/dlp/docs/libraries) and deidentifying text according to provided settings. - DLPDeidentifyText() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- DLPDeidentifyText.Builder - Class in org.apache.beam.sdk.extensions.ml
- DLPInspectText - Class in org.apache.beam.sdk.extensions.ml
-
A
PTransformconnecting to Cloud DLP (https://cloud.google.com/dlp/docs/libraries) and inspecting text for identifying data according to provided settings. - DLPInspectText() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPInspectText
- DLPInspectText.Builder - Class in org.apache.beam.sdk.extensions.ml
- DLPReidentifyText - Class in org.apache.beam.sdk.extensions.ml
-
A
PTransformconnecting to Cloud DLP (https://cloud.google.com/dlp/docs/libraries) and inspecting text for identifying data according to provided settings. - DLPReidentifyText() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- DLPReidentifyText.Builder - Class in org.apache.beam.sdk.extensions.ml
- DlqProvider - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- DlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
- DO_NOT_CLONE - Enum constant in enum class org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior
-
Deprecated.
- DO_NOT_ENTER_TRANSFORM - Enum constant in enum class org.apache.beam.sdk.Pipeline.PipelineVisitor.CompositeBehavior
- doChecks(PAssert.PAssertionSite, ActualT, SerializableFunction<ActualT, Void>) - Static method in class org.apache.beam.sdk.testing.PAssert
- DockerEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
-
An
EnvironmentFactorythat creates docker containers by shelling out to docker. - DockerEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
-
Provider for DockerEnvironmentFactory.
- docToBulk() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
- DocToBulk() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
- Document() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
- doesMetadataTableExist() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- doFn - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- DoFn<InputT,
OutputT> - Class in org.apache.beam.sdk.transforms -
The argument to
ParDoproviding the code to use to process elements of the inputPCollection. - DoFn() - Constructor for class org.apache.beam.sdk.transforms.DoFn
- DoFn.AlwaysFetched - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for declaring that a state parameter is always fetched.
- DoFn.BoundedPerElement - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation on a splittable
DoFnspecifying that theDoFnperforms a bounded amount of work per input element, so applying it to a boundedPCollectionwill also produce a boundedPCollection. - DoFn.BundleFinalizer - Interface in org.apache.beam.sdk.transforms
-
A parameter that is accessible during
@StartBundle,@ProcessElementand@FinishBundlethat allows the caller to register a callback that will be invoked after the bundle has been successfully completed and the runner has committed the output. - DoFn.BundleFinalizer.Callback - Interface in org.apache.beam.sdk.transforms
-
An instance of a function that will be invoked after bundle finalization.
- DoFn.Element - Annotation Interface in org.apache.beam.sdk.transforms
-
Parameter annotation for the input element for
DoFn.ProcessElement,DoFn.GetInitialRestriction,DoFn.GetSize,DoFn.SplitRestriction,DoFn.GetInitialWatermarkEstimatorState,DoFn.NewWatermarkEstimator, andDoFn.NewTrackermethods. - DoFn.FieldAccess - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for specifying specific fields that are accessed in a Schema PCollection.
- DoFn.FinishBundle - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method to use to finish processing a batch of elements.
- DoFn.FinishBundleContext - Class in org.apache.beam.sdk.transforms
-
Information accessible while within the
DoFn.FinishBundlemethod. - DoFn.GetInitialRestriction - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that maps an element to an initial restriction for a splittable
DoFn. - DoFn.GetInitialWatermarkEstimatorState - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that maps an element and restriction to initial watermark estimator state for a splittable
DoFn. - DoFn.GetRestrictionCoder - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that returns the coder to use for the restriction of a splittable
DoFn. - DoFn.GetSize - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that returns the corresponding size for an element and restriction pair.
- DoFn.GetWatermarkEstimatorStateCoder - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that returns the coder to use for the watermark estimator state of a splittable
DoFn. - DoFn.Key - Annotation Interface in org.apache.beam.sdk.transforms
-
Parameter annotation for dereferencing input element key in
KVpair. - DoFn.MultiOutputReceiver - Interface in org.apache.beam.sdk.transforms
-
Receives tagged output for a multi-output function.
- DoFn.NewTracker - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that creates a new
RestrictionTrackerfor the restriction of a splittableDoFn. - DoFn.NewWatermarkEstimator - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that creates a new
WatermarkEstimatorfor the watermark state of a splittableDoFn. - DoFn.OnTimer - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for registering a callback for a timer.
- DoFn.OnTimerContext - Class in org.apache.beam.sdk.transforms
-
Information accessible when running a
DoFn.OnTimermethod. - DoFn.OnTimerFamily - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for registering a callback for a timerFamily.
- DoFn.OnWindowExpiration - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method to use for performing actions on window expiration.
- DoFn.OnWindowExpirationContext - Class in org.apache.beam.sdk.transforms
- DoFn.OutputReceiver<T> - Interface in org.apache.beam.sdk.transforms
-
Receives values of the given type.
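The `OutputReceiver` pattern lets a `@ProcessElement` method emit zero or more outputs through a callback rather than a return value. The sketch below uses a simplified stand-in interface, not Beam's actual `DoFn.OutputReceiver<T>`, to show the shape of that interaction.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the OutputReceiver pattern: processing code emits
// outputs through a receiver callback. The Receiver interface is a
// simplified stand-in for Beam's DoFn.OutputReceiver<T>.
public class OutputReceiverSketch {
    public interface Receiver<T> { void output(T value); }

    // A "process element" style method that can emit multiple outputs.
    public static void splitWords(String line, Receiver<String> out) {
        for (String word : line.split("\\s+")) {
            if (!word.isEmpty()) out.output(word);
        }
    }

    public static void main(String[] args) {
        List<String> collected = new ArrayList<>();
        splitWords("hello  beam world", collected::add);
        System.out.println(collected);
    }
}
```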
- DoFn.ProcessContext - Class in org.apache.beam.sdk.transforms
-
Information accessible when running a
DoFn.ProcessElementmethod. - DoFn.ProcessContinuation - Class in org.apache.beam.sdk.transforms
-
When used as a return value of
DoFn.ProcessElement, indicates whether there is more work to be done for the current element. - DoFn.ProcessElement - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method to use for processing elements.
- DoFn.RequiresStableInput - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation that may be added to a
DoFn.ProcessElement,DoFn.OnTimer, orDoFn.OnWindowExpirationmethod to indicate that the runner must ensure that the observable contents of the inputPCollectionor mutable state are stable upon retries. - DoFn.RequiresTimeSortedInput - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation that may be added to a
DoFn.ProcessElementmethod to indicate that the runner must ensure that the observable contents of the inputPCollectionare sorted by time, in ascending order. - DoFn.Restriction - Annotation Interface in org.apache.beam.sdk.transforms
-
Parameter annotation for the restriction for
DoFn.GetSize,DoFn.SplitRestriction,DoFn.GetInitialWatermarkEstimatorState,DoFn.NewWatermarkEstimator, andDoFn.NewTrackermethods. - DoFn.Setup - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method to use to prepare an instance for processing bundles of elements.
- DoFn.SideInput - Annotation Interface in org.apache.beam.sdk.transforms
-
Parameter annotation for the SideInput for a
DoFn.ProcessElementmethod. - DoFn.SplitRestriction - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that splits the restriction of a splittable
DoFninto multiple parts to be processed in parallel. - DoFn.StartBundle - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method to use to prepare an instance for processing a batch of elements.
- DoFn.StartBundleContext - Class in org.apache.beam.sdk.transforms
-
Information accessible while within the
DoFn.StartBundlemethod. - DoFn.StateId - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for declaring and dereferencing state cells.
- DoFn.Teardown - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method to use to clean up this instance before it is discarded.
- DoFn.TimerFamily - Annotation Interface in org.apache.beam.sdk.transforms
-
Parameter annotation for the TimerMap for a
DoFn.ProcessElementmethod. - DoFn.TimerId - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for declaring and dereferencing timers.
- DoFn.Timestamp - Annotation Interface in org.apache.beam.sdk.transforms
-
Parameter annotation for the input element timestamp for
DoFn.ProcessElement,DoFn.GetInitialRestriction,DoFn.GetSize,DoFn.SplitRestriction,DoFn.GetInitialWatermarkEstimatorState,DoFn.NewWatermarkEstimator, andDoFn.NewTrackermethods. - DoFn.TruncateRestriction - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that truncates the restriction of a splittable
DoFninto a bounded one. - DoFn.UnboundedPerElement - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation on a splittable
DoFnspecifying that theDoFnperforms an unbounded amount of work per input element, so applying it to a boundedPCollectionwill produce an unboundedPCollection. - DoFn.WatermarkEstimatorState - Annotation Interface in org.apache.beam.sdk.transforms
-
Parameter annotation for the watermark estimator state for the
DoFn.NewWatermarkEstimatormethod. - DoFn.WindowedContext - Class in org.apache.beam.sdk.transforms
-
Information accessible to all methods in this
DoFnwhere the context is in some window. - DoFnFunction<OutputT,
InputT> - Class in org.apache.beam.runners.twister2.translators.functions -
DoFn function.
- DoFnFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
- DoFnFunction(Twister2TranslationContext, DoFn<InputT, OutputT>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, List<TupleTag<?>>, WindowingStrategy<?, ?>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, TupleTag<OutputT>, DoFnSchemaInformation, Map<TupleTag<?>, Integer>, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
- DoFnOperator<PreInputT,
InputT, - Class in org.apache.beam.runners.flink.translation.wrappers.streamingOutputT> -
Flink operator for executing
DoFns. - DoFnOperator(DoFn<InputT, OutputT>, String, Coder<WindowedValue<InputT>>, Map<TupleTag<?>, Coder<?>>, TupleTag<OutputT>, List<TupleTag<?>>, DoFnOperator.OutputManagerFactory<OutputT>, WindowingStrategy<?, ?>, Map<Integer, PCollectionView<?>>, Collection<PCollectionView<?>>, PipelineOptions, Coder<?>, KeySelector<WindowedValue<InputT>, ?>, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
-
Constructor for DoFnOperator.
- DoFnOperator.BufferedOutputManager<OutputT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
-
A
WindowedValueReceiverthat can buffer its outputs. - DoFnOperator.FlinkStepContext - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
-
StepContextfor runningDoFnson Flink. - DoFnOperator.MultiOutputOutputManagerFactory<OutputT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
-
Implementation of
DoFnOperator.OutputManagerFactorythat creates aDoFnOperator.BufferedOutputManagerthat can write to multiple logical outputs via Flink side outputs. - DoFnOutputReceivers - Class in org.apache.beam.sdk.transforms
-
Common
DoFn.OutputReceiverandDoFn.MultiOutputReceiverclasses. - DoFnOutputReceivers() - Constructor for class org.apache.beam.sdk.transforms.DoFnOutputReceivers
- doFnRunner - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- DoFnRunnerWithMetrics<InputT,
OutputT> - Class in org.apache.beam.runners.spark.translation -
DoFnRunner decorator which registers
MetricsContainerImpl. - DoFnRunnerWithMetrics(String, DoFnRunner<InputT, OutputT>, MetricsContainerStepMapAccumulator) - Constructor for class org.apache.beam.runners.spark.translation.DoFnRunnerWithMetrics
- DoFnRunnerWithMetricsUpdate<InputT,
OutputT> - Class in org.apache.beam.runners.flink.metrics -
DoFnRunnerdecorator which registersMetricsContainerImpl. - DoFnRunnerWithMetricsUpdate(String, DoFnRunner<InputT, OutputT>, FlinkMetricContainer) - Constructor for class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
- DoFns - Search tag in class org.apache.beam.sdk.transforms.ParDo
- Section
- DoFnSchemaInformation - Class in org.apache.beam.sdk.transforms
-
Represents information about how a DoFn extracts schemas.
- DoFnSchemaInformation() - Constructor for class org.apache.beam.sdk.transforms.DoFnSchemaInformation
- DoFnSchemaInformation.Builder - Class in org.apache.beam.sdk.transforms
-
The builder object.
- DoFnTester<InputT,
OutputT> - Class in org.apache.beam.sdk.transforms -
Deprecated.Use
TestPipelinewith theDirectRunner. - DoFnTester.CloningBehavior - Enum Class in org.apache.beam.sdk.transforms
-
Deprecated.Use
TestPipelinewith theDirectRunner. - doHoldLock(Range.ByteStringRange, String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Returns true if the uuid holds the lock on the partition.
- DONE - Enum constant in enum class org.apache.beam.sdk.PipelineResult.State
-
The job has successfully completed.
- doPartitionsOverlap(Range.ByteStringRange, Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Returns true if the two ByteStringRanges overlap, otherwise false.
- dotExpression() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- dotExpression() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
- dotExpressionComponent() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- dotExpressionComponent() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
- dotExpressionComponent(int) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
- DotExpressionComponentContext() - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
- DotExpressionComponentContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
- DotExpressionContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
- DOUBLE - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- DOUBLE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- DOUBLE - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of double fields.
- DOUBLE_NAN_WRAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- DOUBLE_NEGATIVE_INF_WRAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- DOUBLE_POSITIVE_INF_WRAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- DoubleCoder - Class in org.apache.beam.sdk.coders
-
A
DoubleCoderencodesDoublevalues in 8 bytes using Java serialization. - doubles() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The
TypeDescriptorfor Double. - doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a
PTransformthat takes an inputPCollection<Double>and returns aPCollection<Double>whose contents is the maximum of the inputPCollection's elements, orDouble.NEGATIVE_INFINITYif there are no elements. - doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a
PTransformthat takes an inputPCollection<Double>and returns aPCollection<Double>whose contents is the minimum of the inputPCollection's elements, orDouble.POSITIVE_INFINITYif there are no elements. - doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
-
Returns a
PTransformthat takes an inputPCollection<Double>and returns aPCollection<Double>whose contents is the sum of the inputPCollection's elements, or0if there are no elements. - doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a
PTransformthat takes an inputPCollection<KV<K, Double>>and returns aPCollection<KV<K, Double>>that contains an output element mapping each distinct key in the inputPCollectionto the maximum of the values associated with that key in the inputPCollection. - doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a
PTransformthat takes an inputPCollection<KV<K, Double>>and returns aPCollection<KV<K, Double>>that contains an output element mapping each distinct key in the inputPCollectionto the minimum of the values associated with that key in the inputPCollection. - doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
-
Returns a
PTransformthat takes an inputPCollection<KV<K, Double>>and returns aPCollection<KV<K, Double>>that contains an output element mapping each distinct key in the inputPCollectionto the sum of the values associated with that key in the inputPCollection. - doubleToByteArray(double) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
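The perKey combiners above (`Max.doublesPerKey()`, `Min.doublesPerKey()`, `Sum.doublesPerKey()`) all share one idea: group values by key and fold each group with a combining function. The sketch below shows that semantics in plain Java over a local list of pairs; Beam applies the same idea, in distributed form, to a `PCollection<KV<K, Double>>`.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Plain-Java illustration of perKey combiner semantics: fold values per
// key with a combining function. Not Beam code; swapping Double::sum for
// Math::max or Math::min gives the Max/Min variants.
public class PerKeySketch {
    public static Map<String, Double> sumPerKey(List<Map.Entry<String, Double>> pairs) {
        Map<String, Double> result = new HashMap<>();
        for (Map.Entry<String, Double> kv : pairs) {
            result.merge(kv.getKey(), kv.getValue(), Double::sum);
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(sumPerKey(List.of(
            Map.entry("a", 1.0), Map.entry("b", 2.0), Map.entry("a", 3.0))));
    }
}
```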
- drive() - Method in interface org.apache.beam.runners.local.ExecutionDriver
- DriverConfiguration() - Constructor for class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- Driver configuration - Search tag in class org.apache.beam.sdk.io.neo4j.Neo4jIO
- Section
- dropCatalog(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.CatalogManager
-
Drops the catalog with this name.
- dropCatalog(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.EmptyCatalogManager
- dropCatalog(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogManager
- dropCatalog(SqlIdentifier, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogManagerSchema
- dropDatabase(String, boolean) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
-
Drops the database with this name.
- dropDatabase(String, boolean) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- dropDatabase(String, boolean) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergCatalog
- dropDatabase(SqlIdentifier, boolean, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogSchema
- dropExpiredTimers(SparkTimerInternals, WindowingStrategy<?, W>) - Static method in class org.apache.beam.runners.spark.util.TimerUtils
- DropFields - Class in org.apache.beam.sdk.schemas.transforms
-
A transform to drop fields from a schema.
- DropFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.DropFields
- DropFields.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
-
Implementation class for DropFields.
- dropNamespace(String, boolean) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- dropping(List<String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergMetastore
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
- dropTable(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Drops a table.
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
- dropTable(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- dropTable(SqlParserPos, boolean, SqlIdentifier) - Static method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDdlNodes
-
Creates a DROP TABLE.
- dryRunQuery(String, JobConfigurationQuery, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Dry runs the query in the given project.
- dryRunQuery(String, JobConfigurationQuery, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- dStreamValues(JavaPairDStream<T1, T2>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Transform a pair stream into a value stream.
- duplicate - Enum constant in enum class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent.Reason
- duplicate() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- duplicate() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- duplicate() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- DURATION - Enum constant in enum class org.apache.beam.sdk.transforms.display.DisplayData.Type
- DurationCoder - Class in org.apache.beam.sdk.coders
- DurationConvert() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.DurationConvert
- durationMilliSec - Variable in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
- DYNAMIC_DESTINATIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- DYNAMIC_DESTINATIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- DynamicAvroDestinations<UserT,
DestinationT, - Class in org.apache.beam.sdk.extensions.avro.ioOutputT> -
A specialization of
FileBasedSink.DynamicDestinationsforAvroIO. - DynamicAvroDestinations() - Constructor for class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
- DynamicDestinations<T,
DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery -
This class provides the most general way of specifying dynamic BigQuery table destinations.
- DynamicDestinations - Interface in org.apache.beam.sdk.io.iceberg
- DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
- DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
- Dynamic destinations - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Dynamic Destinations - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- DynamicFileDestinations - Class in org.apache.beam.sdk.io
-
Some helper classes that derive from
FileBasedSink.DynamicDestinations. - DynamicFileDestinations() - Constructor for class org.apache.beam.sdk.io.DynamicFileDestinations
- DynamicProtoCoder - Class in org.apache.beam.sdk.extensions.protobuf
-
A
Coderusing Google Protocol Buffers binary format. - dynamicWrite() - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO
- Dynamic Writing to a MQTT Broker - Search tag in class org.apache.beam.sdk.io.mqtt.MqttIO
- Section
- DynamoDBIO - Class in org.apache.beam.sdk.io.aws2.dynamodb
-
IO to read from and write to DynamoDB tables.
- DynamoDBIO() - Constructor for class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO
- DynamoDBIO.Read<T> - Class in org.apache.beam.sdk.io.aws2.dynamodb
-
Read data from DynamoDB using
DynamoDBIO.Read.getScanRequestFn()and emit an element of typeTfor eachScanResponseusing the mapping functionDynamoDBIO.Read.getScanResponseMapperFn(). - DynamoDBIO.Write<T> - Class in org.apache.beam.sdk.io.aws2.dynamodb
-
Write a PCollection of data into DynamoDB.
E
- EARLIEST - Enum constant in enum class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows.StartingStrategy
- EARLIEST - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
The policy of taking the earliest of a set of timestamps.
- EARLY - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.PaneInfo.Timing
-
Pane was fired before the input watermark had progressed after the end of the window.
- EarlyBinder(KeyedStateBackend, SerializablePipelineOptions, Coder<? extends BoundedWindow>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- eitherOf(Watch.Growth.TerminationCondition<InputT, FirstStateT>, Watch.Growth.TerminationCondition<InputT, SecondStateT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Returns a Watch.Growth.TerminationCondition that holds when at least one of the given two conditions holds.
- ElasticsearchIO - Class in org.apache.beam.sdk.io.elasticsearch
-
Transforms for reading and writing data from/to Elasticsearch.
- ElasticsearchIO.BoundedElasticsearchSource - Class in org.apache.beam.sdk.io.elasticsearch
-
A BoundedSource reading from Elasticsearch.
- ElasticsearchIO.BulkIO - Class in org.apache.beam.sdk.io.elasticsearch
-
A PTransform writing Bulk API entities created by ElasticsearchIO.DocToBulk to an Elasticsearch cluster.
- ElasticsearchIO.ConnectionConfiguration - Class in org.apache.beam.sdk.io.elasticsearch
-
A POJO describing a connection configuration to Elasticsearch.
- ElasticsearchIO.DocToBulk - Class in org.apache.beam.sdk.io.elasticsearch
-
A PTransform converting docs to their Bulk API counterparts.
- ElasticsearchIO.Document - Class in org.apache.beam.sdk.io.elasticsearch
- ElasticsearchIO.DocumentCoder - Class in org.apache.beam.sdk.io.elasticsearch
- ElasticsearchIO.Read - Class in org.apache.beam.sdk.io.elasticsearch
-
A PTransform reading data from Elasticsearch.
- ElasticsearchIO.RetryConfiguration - Class in org.apache.beam.sdk.io.elasticsearch
-
A POJO encapsulating a configuration for retry behavior when issuing requests to ES.
- ElasticsearchIO.Write - Class in org.apache.beam.sdk.io.elasticsearch
-
A PTransform writing data to Elasticsearch.
- ElasticsearchIO.Write.BooleanFieldValueExtractFn - Interface in org.apache.beam.sdk.io.elasticsearch
- ElasticsearchIO.Write.FieldValueExtractFn - Interface in org.apache.beam.sdk.io.elasticsearch
- ElasticsearchIOITCommon - Class in org.apache.beam.sdk.io.elasticsearch
-
Manipulates test data used by the ElasticsearchIO integration tests.
- ElasticsearchIOITCommon() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon
- ElasticsearchIOITCommon.ElasticsearchPipelineOptions - Interface in org.apache.beam.sdk.io.elasticsearch
-
Pipeline options for elasticsearch tests.
- element() - Method in class org.apache.beam.runners.twister2.utils.Twister2AssignContext
- element() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
-
Returns the input element to be processed.
- element() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
-
Returns the current element.
- element() - Method in class org.apache.beam.sdk.transforms.WithFailures.ExceptionElement
- ELEMENT - Enum constant in enum class org.apache.beam.sdk.testing.TestStream.EventType
- elementCoder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
- elementCoder() - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.IterableSideInputHandler
-
Returns the Coder to use for the elements of the resulting values iterable.
- elementCountAtLeast(int) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterPane
-
Creates a trigger that fires when the pane contains at least countElems elements.
- ElementDelimitedOutputStream(DataStreams.OutputChunkConsumer<ByteString>, int) - Constructor for class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
- ElementEvent() - Constructor for class org.apache.beam.sdk.testing.TestStream.ElementEvent
- elements() - Static method in class org.apache.beam.sdk.transforms.ToString
- elementsIterable() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItem
- elementsRead() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Counter of elements read by a source.
- elementsReadBySplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Counter of elements read by a source split.
- elementsWritten() - Static method in class org.apache.beam.sdk.metrics.SinkMetrics
-
Counter of elements written to a sink.
- ElemToBytesFunction<V> - Class in org.apache.beam.runners.twister2.translators.functions
-
Map to tuple function.
- ElemToBytesFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ElemToBytesFunction
- ElemToBytesFunction(WindowedValues.WindowedValueCoder<V>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.ElemToBytesFunction
- EMBEDDED_ARTIFACT_URN - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- EmbeddedEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
-
An EnvironmentFactory that communicates with a FnHarness executing in the same process.
- EmbeddedEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
-
Provider of EmbeddedEnvironmentFactory.
- empty() - Static method in class org.apache.beam.runners.local.StructuralKey
-
Get the empty StructuralKey.
- empty() - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
- empty() - Static method in class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
- empty() - Static method in class org.apache.beam.sdk.metrics.BoundedTrieResult
- empty() - Static method in class org.apache.beam.sdk.metrics.GaugeResult
- empty() - Static method in class org.apache.beam.sdk.metrics.StringSetResult
- empty() - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Asserts that the iterable in question is empty.
- empty() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
- empty() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.empty().
- empty() - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Returns an empty CoGbkResult.
- empty() - Static method in class org.apache.beam.sdk.transforms.Requirements
-
Describes an empty set of requirements.
- empty() - Static method in class org.apache.beam.sdk.values.TupleTagList
-
Returns an empty TupleTagList.
- empty(Coder<T>) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new Create.Values transform that produces an empty PCollection.
- empty(Pipeline) - Static method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Returns an empty KeyedPCollectionTuple<K> on the given pipeline.
- empty(Pipeline) - Static method in class org.apache.beam.sdk.values.PCollectionList
-
Returns an empty PCollectionList that is part of the given Pipeline.
- empty(Pipeline) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Returns an empty PCollectionRowTuple that is part of the given Pipeline.
- empty(Pipeline) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
Returns an empty PCollectionTuple that is part of the given Pipeline.
- empty(Schema) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new Create.Values transform that produces an empty PCollection of rows.
- empty(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new Create.Values transform that produces an empty PCollection.
- EMPTY - Enum constant in enum class org.apache.beam.sdk.io.snowflake.enums.WriteDisposition
- EMPTY - Static variable in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
- EMPTY - Static variable in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
- EMPTY - Static variable in class org.apache.beam.sdk.io.range.ByteKey
-
An empty key.
- EMPTY - Static variable in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
- EMPTY_BYTE_ARRAY - Static variable in class org.apache.beam.runners.spark.util.TimerUtils
- EMPTY_ROW - Static variable in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
- EMPTY_SCHEMA - Static variable in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
- emptyArray() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.emptyArray().
- emptyBatch() - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Adds an empty batch.
- EmptyCatalogManager - Class in org.apache.beam.sdk.extensions.sql.meta.catalog
- EmptyCatalogManager() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.catalog.EmptyCatalogManager
- EmptyCheckpointMark - Class in org.apache.beam.runners.spark.io
-
Passing null values to Spark's Java API may cause problems because of Guava preconditions.
- emptyIterable() - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables
-
Returns an empty PrefetchableIterable.
- emptyIterable() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.emptyIterable().
- emptyIterator() - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
-
Returns an empty PrefetchableIterator.
- emptyList() - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
- EmptyListDefault() - Constructor for class org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions.EmptyListDefault
- EmptyListenersList() - Constructor for class org.apache.beam.runners.spark.SparkContextOptions.EmptyListenersList
- EmptyMatchTreatment - Enum Class in org.apache.beam.sdk.io.fs
-
Options for allowing or disallowing filepatterns that match no resources in FileSystems.match(java.util.List<java.lang.String>).
- emptyProperties() - Static method in class org.apache.beam.sdk.extensions.sql.TableUtils
- emptyVoidFunction() - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
- ENABLE_CUSTOM_PUBSUB_SINK - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
- ENABLE_CUSTOM_PUBSUB_SOURCE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
- enableAbandonedNodeEnforcement(boolean) - Method in class org.apache.beam.sdk.testing.TestPipeline
-
Enables the abandoned node detection.
- enableAutoRunIfMissing(boolean) - Method in class org.apache.beam.sdk.testing.TestPipeline
-
If enabled, a pipeline.run() statement will be added automatically in case it is missing in the test.
- Enable client side metrics - Search tag in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
- Section
- enableSSL() - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
Enable SSL connection to Redis server.
- EnableStreamingEngineFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.EnableStreamingEngineFactory
- EnableWindmillServiceDirectPathFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.EnableWindmillServiceDirectPathFactory
- EncodableThrowable - Class in org.apache.beam.sdk.values
-
A wrapper around a Throwable for use with coders.
- encode(byte[], OutputStream) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
- encode(byte[], OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.NullableCoder
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.NullableCoder
- encode(HyperLogLogPlus, OutputStream) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
- encode(TableRow, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- encode(TableRow, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- encode(JsonArray, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
- encode(ByteString, OutputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- encode(ByteString, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- encode(IterableT, OutputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
- encode(Boolean, OutputStream) - Method in class org.apache.beam.sdk.coders.BooleanCoder
- encode(Byte, OutputStream) - Method in class org.apache.beam.sdk.coders.ByteCoder
- encode(Double, OutputStream) - Method in class org.apache.beam.sdk.coders.DoubleCoder
- encode(Float, OutputStream) - Method in class org.apache.beam.sdk.coders.FloatCoder
- encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
- encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
- encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.VarIntCoder
- encode(Integer, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
- encode(Long, OutputStream) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
- encode(Long, OutputStream) - Method in class org.apache.beam.sdk.coders.VarLongCoder
- encode(Short, OutputStream) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
- encode(String, OutputStream) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
- encode(String, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
- encode(Void, OutputStream) - Method in class org.apache.beam.sdk.coders.VoidCoder
- encode(BigDecimal, OutputStream) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
- encode(BigDecimal, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
- encode(BigInteger, OutputStream) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
- encode(BigInteger, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
- encode(BitSet, OutputStream) - Method in class org.apache.beam.sdk.coders.BitSetCoder
- encode(BitSet, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BitSetCoder
- encode(Map<K, V>, OutputStream) - Method in class org.apache.beam.sdk.coders.MapCoder
- encode(Map<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.MapCoder
- encode(Optional<T>, OutputStream) - Method in class org.apache.beam.sdk.coders.OptionalCoder
- encode(SortedMap<K, V>, OutputStream) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- encode(SortedMap<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- encode(K, OutputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.MetadataKeyCoder
- encode(KeyedWorkItem<K, ElemT>, OutputStream) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- encode(KeyedWorkItem<K, ElemT>, OutputStream, Coder.Context) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- encode(IsmFormat.Footer, OutputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.FooterCoder
- encode(IsmFormat.IsmRecord<V>, OutputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- encode(IsmFormat.IsmShard, OutputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShardCoder
- encode(IsmFormat.KeyPrefix, OutputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefixCoder
- encode(RandomAccessData, OutputStream) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- encode(RandomAccessData, OutputStream, Coder.Context) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- encode(SequenceRangeAccumulator, OutputStream) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator.SequenceRangeAccumulatorCoder
- encode(EncodedBoundedWindow, OutputStream) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
- encode(CountingSource.CounterMark, OutputStream) - Method in class org.apache.beam.sdk.io.CountingSource.CounterMarkCoder
- encode(DefaultFilenamePolicy.Params, OutputStream) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
- encode(ElasticsearchIO.Document, OutputStream) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocumentCoder
- encode(FileBasedSink.FileResult<DestinationT>, OutputStream) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
- encode(FileIO.ReadableFile, OutputStream) - Method in class org.apache.beam.sdk.io.ReadableFileCoder
- encode(MatchResult.Metadata, OutputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
- encode(MatchResult.Metadata, OutputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoderV2
- encode(ResourceId, OutputStream) - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
- encode(BigQueryInsertError, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- encode(BigQueryStorageApiInsertError, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
- encode(RowMutation, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation.RowMutationCoder
- encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
- encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
- encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
- encode(BigtableWriteResult, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
- encode(FhirSearchParameter<T>, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
- encode(HealthcareIOError<T>, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
- encode(HL7v2Message, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
- encode(HL7v2ReadResponse, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponseCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
- encode(PubsubMessage, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- encode(PubsubMessage, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- encode(OffsetByteRange, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
- encode(SubscriptionPartition, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
- encode(Uuid, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
- encode(KafkaRecord<K, V>, OutputStream) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- encode(OffsetRange, OutputStream) - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
- encode(SplunkEvent, OutputStream) - Method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
- encode(TestStream<T>, OutputStream) - Method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
- encode(CoGbkResult, OutputStream) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
- encode(RawUnionValue, OutputStream) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- encode(RawUnionValue, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- encode(GlobalWindow, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
- encode(IntervalWindow, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
- encode(PaneInfo, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
- encode(FailsafeValueInSingleWindow<T, ErrorT>, OutputStream) - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
- encode(FailsafeValueInSingleWindow<T, ErrorT>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
- encode(KV<K, V>, OutputStream) - Method in class org.apache.beam.sdk.coders.KvCoder
- encode(KV<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.KvCoder
- encode(PCollectionViews.ValueOrMetadata<T, MetaT>, OutputStream) - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
- encode(ShardedKey<KeyT>, OutputStream) - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
- encode(TimestampedValue<T>, OutputStream) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
- encode(ValueInSingleWindow<T>, OutputStream) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
- encode(ValueInSingleWindow<T>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
- encode(ValueWithRecordId<ValueT>, OutputStream) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- encode(ValueWithRecordId<ValueT>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- encode(WindowedValue<T>, OutputStream) - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- encode(WindowedValue<T>, OutputStream) - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- encode(WindowedValue<T>, OutputStream) - Method in class org.apache.beam.sdk.values.WindowedValues.ValueOnlyWindowedValueCoder
-
Deprecated.
- encode(WindowedValue<T>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- encode(WindowedValue<T>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- encode(WindowedValue<T>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.WindowedValues.ValueOnlyWindowedValueCoder
-
Deprecated.
- encode(ByteString, OutputStream) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- encode(ByteString, OutputStream, Coder.Context) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- encode(ProducerRecord<K, V>, OutputStream) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- encode(TopicPartition, OutputStream) - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
- encode(Message, OutputStream) - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
- encode(Instant, OutputStream) - Method in class org.apache.beam.sdk.coders.InstantCoder
- encode(ReadableDuration, OutputStream) - Method in class org.apache.beam.sdk.coders.DurationCoder
- encode(AttributeValue, OutputStream) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.AttributeValueCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.Coder
-
Encodes the given value of type T onto the given output stream.
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.DelegateCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.SerializableCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.SnappyCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.ZstdCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.io.thrift.ThriftCoder
-
Encodes the given value of type T onto the given output stream using the provided ThriftCoder.protocolFactory.
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.schemas.SchemaCoder
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.Coder
-
Deprecated. Only implement and call Coder.encode(Object value, OutputStream).
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.DelegateCoder
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- encode(T, Coder<T>) - Static method in class org.apache.beam.runners.jet.Utils
- encodeAndHash(List<?>, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
-
Computes the shard id for the given key component(s).
- encodeAndHash(List<?>, RandomAccessData, List<Integer>) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
-
Computes the shard id for the given key component(s).
- encodeAndOwn(byte[], OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
Encodes the provided value with the identical encoding to ByteArrayCoder.encode(byte[], java.io.OutputStream), but with optimizations that take ownership of the value.
- EncodedBoundedWindow - Class in org.apache.beam.sdk.fn.windowing
-
An encoded BoundedWindow used within Runners to track window information without needing to decode the window.
- EncodedBoundedWindow() - Constructor for class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
- EncodedBoundedWindow.Coder - Class in org.apache.beam.sdk.fn.windowing
- encodeDoLoopBlackhole(VarIntBenchmark.Longs, VarIntBenchmark.BlackholeOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- encodeDoLoopByteString(VarIntBenchmark.Longs, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- encodeDoLoopTwiddleBlackhole(VarIntBenchmark.Longs, VarIntBenchmark.BlackholeOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- encodeDoLoopTwiddleByteString(VarIntBenchmark.Longs, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
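The VarIntBenchmark entries above compare strategies for Beam's variable-length integer encoding. As a rough illustration only (this is a hedged plain-Java sketch of base-128 varint encoding, not the benchmark code or Beam's VarInt implementation):

```java
import java.io.ByteArrayOutputStream;

// Minimal sketch of unsigned base-128 ("varint") encoding, the format the
// VarIntBenchmark entries exercise. Each byte carries 7 payload bits; the
// high bit is set while more bytes follow.
public class VarIntSketch {
    public static byte[] encode(long value) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        while ((value & ~0x7FL) != 0) {
            out.write((int) ((value & 0x7F) | 0x80)); // continuation bit set
            value >>>= 7;                             // shift in next 7 bits
        }
        out.write((int) value);                       // final byte, high bit clear
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] b = encode(300);
        System.out.println(b.length);    // 2
        System.out.println(b[0] & 0xFF); // 172 (0xAC)
        System.out.println(b[1] & 0xFF); // 2
    }
}
```

Small values fit in one byte, which is why hot loops over mostly-small longs (the benchmark's scenario) are sensitive to how this loop is written.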
- EncodedValueComparator - Class in org.apache.beam.runners.flink.translation.types
-
Flink TypeComparator for Beam values that have been encoded to byte data by a Coder.
- EncodedValueComparator(boolean) - Constructor for class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- EncodedValueSerializer - Class in org.apache.beam.runners.flink.translation.types
-
TypeSerializer for values that were encoded using a Coder.
- EncodedValueSerializer() - Constructor for class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- EncodedValueTypeInformation - Class in org.apache.beam.runners.flink.translation.types
-
Flink TypeInformation for Beam values that have been encoded to byte data by a Coder.
- EncodedValueTypeInformation() - Constructor for class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- encodeKey(K, Coder<K>) - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.FlinkKeyUtils
-
Encodes a key to a byte array wrapped inside a ByteBuffer.
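A hedged sketch of why wrapping encoded key bytes in a ByteBuffer (as FlinkKeyUtils.encodeKey does) is convenient: unlike byte[], ByteBuffer implements content-based equals()/hashCode(). The helper below is hypothetical and stands in for coder-based encoding:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Hypothetical stand-in: encode a key to bytes and wrap it in a ByteBuffer.
// Two buffers with equal contents compare equal and hash alike, so wrapped
// keys can be used directly as map keys; raw byte[] uses identity equality.
public class KeyWrapSketch {
    public static ByteBuffer encodeKey(String key) {
        return ByteBuffer.wrap(key.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        byte[] a = "user-1".getBytes(StandardCharsets.UTF_8);
        byte[] b = "user-1".getBytes(StandardCharsets.UTF_8);
        System.out.println(a.equals(b));                                      // false
        System.out.println(encodeKey("user-1").equals(encodeKey("user-1"))); // true
    }
}
```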
- encodeLoopBlackhole(VarIntBenchmark.Longs, VarIntBenchmark.BlackholeOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- encodeLoopByteString(VarIntBenchmark.Longs, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- encodePacked32TimeSeconds(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes time as a 4-byte integer with seconds precision.
- encodePacked32TimeSeconds(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes time as a 4-byte integer with seconds precision.
- encodePacked64DatetimeMicros(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes dateTime as an 8-byte integer with microseconds precision.
- encodePacked64DatetimeMicros(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes dateTime as an 8-byte integer with microseconds precision.
- encodePacked64DatetimeSeconds(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes dateTime as an 8-byte integer with seconds precision.
- encodePacked64DatetimeSeconds(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes dateTime as an 8-byte integer with seconds precision.
- encodePacked64TimeMicros(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes time as an 8-byte integer with microseconds precision.
- encodePacked64TimeMicros(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes time as an 8-byte integer with microseconds precision.
- encodePacked64TimeNanos(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes time as an 8-byte integer with nanoseconds precision.
- encodePacked64TimeNanos(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes time as an 8-byte integer with nanoseconds precision.
- encodeQueryResult(Table) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
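The CivilTimeEncoder methods above pack civil time fields into fixed-width integers. As a hedged sketch of the seconds-precision case: the bit layout used here (second in the low 6 bits, minute in the next 6, hour above) follows the ZetaSQL-style civil-time encoding and is an assumption; this is not the Beam implementation:

```java
import java.time.LocalTime;

// Sketch of packing a LocalTime into a 4-byte integer with seconds precision.
// Assumed layout: | hour | minute : 6 bits | second : 6 bits |
public class PackedTimeSketch {
    public static int encodePacked32TimeSeconds(LocalTime time) {
        return (time.getHour() << 12) | (time.getMinute() << 6) | time.getSecond();
    }

    public static LocalTime decodePacked32TimeSeconds(int packed) {
        return LocalTime.of(packed >>> 12, (packed >>> 6) & 0x3F, packed & 0x3F);
    }

    public static void main(String[] args) {
        int packed = encodePacked32TimeSeconds(LocalTime.of(12, 34, 56));
        System.out.println(packed);                             // 51384
        System.out.println(decodePacked32TimeSeconds(packed));  // 12:34:56
    }
}
```

The microseconds and nanoseconds variants in the index extend the same idea with extra low-order bits for the sub-second field, hence the wider 8-byte result.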
- encodeQueryResult(Table, List<TableRow>) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- encoderFactory() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
- EncoderFactory - Class in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
- EncoderFactory() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderFactory
- encoderFor(Coder<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
- EncoderHelpers - Class in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
-
Encoders utility class.
- EncoderHelpers() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
- EncoderHelpers.Utils - Class in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
-
Encoder / expression utils that are called from generated code.
- encoderOf(Class<? super T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
-
Gets or creates a default
Encoder for EncoderHelpers. - encoderOf(Coder<T>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
- encoderOf(Coder<T>, EncoderProvider.Factory<T>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
- encoderOf(Coder<T>, EncoderProvider.Factory<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- EncoderProvider - Interface in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
- EncoderProvider.Factory<T> - Interface in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
- encodeToTimerDataTimerId(String, String) - Static method in class org.apache.beam.runners.fnexecution.control.TimerReceiverFactory
-
Encodes transform and timer family ids into a single string which retains the human-readable format
len(transformId):transformId:timerId. - encodeUnrolledBlackhole(VarIntBenchmark.Longs, VarIntBenchmark.BlackholeOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
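The length-prefixed len(transformId):transformId:timerId format described above can be sketched standalone; TimerIdFormat and its methods are illustrative names, not Beam's actual TimerReceiverFactory code:

```java
// Illustrative sketch of the length-prefixed id format described above.
public class TimerIdFormat {
    static String encode(String transformId, String timerId) {
        // Prefix with the transform id's length so a ':' inside the id cannot
        // be confused with the separator.
        return transformId.length() + ":" + transformId + ":" + timerId;
    }

    static String[] decode(String encoded) {
        int firstColon = encoded.indexOf(':');
        int len = Integer.parseInt(encoded.substring(0, firstColon));
        String transformId = encoded.substring(firstColon + 1, firstColon + 1 + len);
        String timerId = encoded.substring(firstColon + 2 + len); // skip the second ':'
        return new String[] {transformId, timerId};
    }

    public static void main(String[] args) {
        System.out.println(encode("pardo:1", "myTimer")); // 7:pardo:1:myTimer
    }
}
```

The length prefix is what keeps the format unambiguous even when the transform id itself contains the separator character.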
- encodeUnrolledByteString(VarIntBenchmark.Longs, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- ENCODING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- EncodingException - Exception Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
-
Represents an error during encoding (serializing) a class.
- EncodingException - Exception Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
-
Represents an error during encoding (serializing) a class.
- EncodingException(Throwable) - Constructor for exception class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.EncodingException
- EncodingException(Throwable) - Constructor for exception class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.EncodingException
- end() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns the end of this window, exclusive.
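IntervalWindow.end() above is exclusive: a timestamp equal to the end bound belongs to the next window. A minimal sketch of that half-open containment rule with plain longs (Beam's real class uses Instant; this class is invented for illustration):

```java
// A half-open interval [start, end), mirroring the "end is exclusive" contract above.
public class HalfOpenWindow {
    final long start;
    final long end; // exclusive

    HalfOpenWindow(long start, long end) {
        this.start = start;
        this.end = end;
    }

    // A timestamp equal to 'end' is NOT in this window; it belongs to the next one.
    boolean contains(long ts) {
        return ts >= start && ts < end;
    }

    public static void main(String[] args) {
        HalfOpenWindow w = new HalfOpenWindow(0, 10);
        System.out.println(w.contains(9) + " " + w.contains(10)); // true false
    }
}
```

Half-open bounds let adjacent windows tile a timeline with no gap and no timestamp assigned to two windows.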
- END_CURSOR - Static variable in class org.apache.beam.sdk.io.redis.RedisCursor
- END_OF_WINDOW - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
The policy of using the end of the window, regardless of input timestamps.
- endpoint() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
-
Optional service endpoint to use AWS compatible services instead, e.g.
- endpoint(URI) - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
-
Optional service endpoint to use AWS compatible services instead, e.g.
- endsWith(String) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- endsWith(String) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.endsWith(java.lang.String). - endsWith(String, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- endsWith(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- Enhanced Fan-Out - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- Enhanced Fan-Out and KinesisIO state management - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- Enhanced Fan-Out and other KinesisIO settings - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- ensureUsableAsCloudPubsub() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
-
Ensure that all messages that pass through can be converted to Cloud Pub/Sub messages using the standard transformation methods in the client library.
- ENTER_TRANSFORM - Enum constant in enum class org.apache.beam.sdk.Pipeline.PipelineVisitor.CompositeBehavior
- enterArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by
FieldSpecifierNotationParser.arrayQualifier(). - enterArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by
FieldSpecifierNotationParser.arrayQualifier(). - enterArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by the
arrayQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList(). - enterArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by the
arrayQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList(). - enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.flink.translation.utils.LookupPipelineVisitor
- enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkNativePipelineVisitor
- enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
- enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.translation.GroupByKeyVisitor
- enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
- enterCompositeTransform(TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
-
Called for each composite transform after all topological predecessors have been visited but before any of its component transforms.
- enterDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by
FieldSpecifierNotationParser.dotExpression(). - enterDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by
FieldSpecifierNotationParser.dotExpression(). - enterEveryRule(ParserRuleContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
- enterFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by
FieldSpecifierNotationParser.fieldSpecifier(). - enterFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by
FieldSpecifierNotationParser.fieldSpecifier(). - enterMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by
FieldSpecifierNotationParser.mapQualifier(). - enterMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by
FieldSpecifierNotationParser.mapQualifier(). - enterMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by the
mapQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList(). - enterMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by the
mapQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList(). - enterPipeline(Pipeline) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
- enterPipeline(Pipeline) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
-
Called before visiting any values or transforms, as many uses of a visitor require access to the
Pipeline object itself. - enterQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by
FieldSpecifierNotationParser.qualifiedComponent(). - enterQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by
FieldSpecifierNotationParser.qualifiedComponent(). - enterQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by the
qualifyComponent labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent(). - enterQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by the
qualifyComponent labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent(). - enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifyComponentContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.WildcardContext
- enterSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by the
simpleIdentifier labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent(). - enterSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by the
simpleIdentifier labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent(). - enterWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by the
wildcard labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent(). - enterWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by the
wildcard labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent(). - EntityToRow - Class in org.apache.beam.sdk.io.gcp.datastore
- entries() - Method in interface org.apache.beam.sdk.state.MapState
-
Returns an
Iterable over the key-value pairs contained in this map. - entries() - Method in interface org.apache.beam.sdk.state.MultimapState
-
Returns an
Iterable over all key-value pairs contained in this multimap. - entrySet() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedMap
- entrySet() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- enum16(Map<String, Integer>) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- ENUM16 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- enum8(Map<String, Integer>) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- ENUM8 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- EnumerationType - Class in org.apache.beam.sdk.schemas.logicaltypes
-
This
Schema.LogicalType represents an enumeration over a fixed set of values. - EnumerationType.Value - Class in org.apache.beam.sdk.schemas.logicaltypes
-
This class represents a single enum value.
- enumValues() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- ENVIRONMENT_VERSION_JOB_TYPE_KEY - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- ENVIRONMENT_VERSION_MAJOR_KEY - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- EnvironmentFactory - Interface in org.apache.beam.runners.fnexecution.environment
-
Creates
environments which communicate to an SdkHarnessClient. - EnvironmentFactory.Provider - Interface in org.apache.beam.runners.fnexecution.environment
-
Provider for an
EnvironmentFactory and ServerFactory for the environment. - equal(T) - Static method in class org.apache.beam.sdk.transforms.Filter
-
Returns a
PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are equal to a given value. - equals(Object) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- equals(Object) - Method in class org.apache.beam.runners.dataflow.util.CloudObject
- equals(Object) - Method in class org.apache.beam.runners.dataflow.util.OutputReference
- equals(Object) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
- equals(Object) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- equals(Object) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- equals(Object) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- equals(Object) - Method in class org.apache.beam.runners.jet.Utils.ByteArrayKey
- equals(Object) - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
- equals(Object) - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
- equals(Object) - Method in class org.apache.beam.runners.spark.util.ByteArray
- equals(Object) - Method in class org.apache.beam.runners.spark.util.TimerUtils.TimerMarker
- equals(Object) - Method in class org.apache.beam.sdk.coders.AtomicCoder
-
.
- equals(Object) - Method in class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated.
- equals(Object) - Method in class org.apache.beam.sdk.coders.DelegateCoder
- equals(Object) - Method in class org.apache.beam.sdk.coders.RowCoder
- equals(Object) - Method in class org.apache.beam.sdk.coders.SerializableCoder
- equals(Object) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- equals(Object) - Method in class org.apache.beam.sdk.coders.StructuralByteArray
- equals(Object) - Method in class org.apache.beam.sdk.coders.StructuredCoder
- equals(Object) - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- equals(Object) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
- equals(Object) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
- equals(Object) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- equals(Object) - Method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
- equals(Object) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- equals(Object) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- equals(Object) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
- equals(Object) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
- equals(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- equals(Object) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
- equals(Object) - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteSuccessSummary
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- equals(Object) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
- equals(Object) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
- equals(Object) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
- equals(Object) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
- equals(Object) - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- equals(Object) - Method in class org.apache.beam.sdk.io.range.ByteKey
- equals(Object) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
- equals(Object) - Method in class org.apache.beam.sdk.io.range.OffsetRange
- equals(Object) - Method in class org.apache.beam.sdk.io.redis.RedisCursor
- equals(Object) - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
You need to override this method to be able to compare these objects by value.
- equals(Object) - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
-
You need to override this method to be able to compare these objects by value.
- equals(Object) - Method in class org.apache.beam.sdk.io.solace.read.SolaceCheckpointMark
- equals(Object) - Method in class org.apache.beam.sdk.io.tika.ParseResult
- equals(Object) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- equals(Object) - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
- equals(Object) - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
- equals(Object) - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
- equals(Object) - Method in class org.apache.beam.sdk.schemas.CachingFactory
- equals(Object) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated.
- equals(Object) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
- equals(Object) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
- equals(Object) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.SetterTypeSupplier
- equals(Object) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType.Value
- equals(Object) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
- equals(Object) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns true if two Schemas have the same fields in the same order.
- equals(Object) - Method in class org.apache.beam.sdk.schemas.Schema.Field
- equals(Object) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- equals(Object) - Method in class org.apache.beam.sdk.schemas.Schema.Options
- equals(Object) - Method in class org.apache.beam.sdk.schemas.SchemaCoder
- equals(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- equals(Object) - Method in class org.apache.beam.sdk.testing.PAssert.PAssertionSite
- equals(Object) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
Deprecated.
Object.equals(Object) is not supported on PAssert objects. If you meant to test object equality, use a variant of PAssert.PCollectionContentsAssert.containsInAnyOrder(T...) instead. - equals(Object) - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
- equals(Object) - Method in class org.apache.beam.sdk.testing.TestStream
- equals(Object) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
-
Deprecated.
- equals(Object) - Method in class org.apache.beam.sdk.transforms.CombineFns.CoCombineResult
- equals(Object) - Method in class org.apache.beam.sdk.transforms.display.DisplayData
- equals(Object) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
- equals(Object) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
- equals(Object) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
- equals(Object) - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
- equals(Object) - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
- equals(Object) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHint
- equals(Object) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
- equals(Object) - Method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
- equals(Object) - Method in class org.apache.beam.sdk.values.KV
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionList
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionTuple
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
- equals(Object) - Method in class org.apache.beam.sdk.values.Row
- equals(Object) - Method in class org.apache.beam.sdk.values.RowWithGetters
- equals(Object) - Method in class org.apache.beam.sdk.values.ShardedKey
-
Deprecated.
- equals(Object) - Method in class org.apache.beam.sdk.values.TimestampedValue
- equals(Object) - Method in class org.apache.beam.sdk.values.TupleTag
- equals(Object) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Two type descriptors are equal if and only if they represent the same type. - equals(Object) - Method in class org.apache.beam.sdk.values.TypeParameter
- equals(Object) - Method in class org.apache.beam.sdk.values.TypeParameter
- equals(Object) - Method in class org.apache.beam.sdk.values.ValueWithRecordId
- equals(Object) - Method in class org.apache.beam.sdk.values.WindowingStrategy
- equals(Object) - Method in class org.apache.beam.runners.flink.adapter.FlinkKey
- equals(Object) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- equals(Object) - Method in class org.apache.beam.sdk.coders.ZstdCoder
- equals(Object) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
- equals(Object) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- equals(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- equals(Object) - Method in class org.apache.beam.sdk.io.cassandra.RingRange
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEndRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEventRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionStartRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
- equals(Object) - Method in class org.apache.beam.sdk.values.EncodableThrowable
- equals(WindowedValue<T>, WindowedValue<T>) - Static method in class org.apache.beam.sdk.values.WindowedValues
- equals(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- Equals() - Constructor for class org.apache.beam.sdk.values.Row.Equals
- EQUALS - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- equalTo(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.equalTo(Object). - equalTo(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.equalTo(Object). - equalToReference(byte[]) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- equivalent(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns true if two Schemas have the same fields, but possibly in different orders.
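Schema.equals (above) requires the same fields in the same order, while Schema.equivalent also accepts reordered fields. A standalone sketch of the two comparison policies over plain field-name lists — FieldCompare and its method names are invented for illustration:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;

public class FieldCompare {
    // Order-sensitive comparison, like the equals contract described above.
    static boolean sameFieldsSameOrder(List<String> a, List<String> b) {
        return a.equals(b);
    }

    // Order-insensitive comparison, like the equivalent contract described above.
    // The size check guards against duplicate field names collapsing in the set.
    static boolean sameFieldsAnyOrder(List<String> a, List<String> b) {
        return a.size() == b.size() && new HashSet<>(a).equals(new HashSet<>(b));
    }

    public static void main(String[] args) {
        List<String> x = Arrays.asList("id", "name");
        List<String> y = Arrays.asList("name", "id");
        System.out.println(sameFieldsSameOrder(x, y) + " " + sameFieldsAnyOrder(x, y)); // false true
    }
}
```

Real Beam schemas compare field types and nullability as well, not just names; this sketch isolates the ordering distinction only.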
- equivalent(Schema.FieldType, Schema.EquivalenceNullablePolicy) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Check whether two types are equivalent.
- ERROR - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
-
Deprecated. Level for logging error messages.
- ERROR - Enum constant in enum class org.apache.beam.sdk.io.fs.MatchResult.Status
- ERROR - Enum constant in enum class org.apache.beam.sdk.io.snowflake.enums.StreamingLogLevel
- ERROR - Enum constant in enum class org.apache.beam.sdk.options.PipelineOptions.CheckEnabled
- ERROR - Enum constant in enum class org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
-
LogLevel for logging error messages.
- ERROR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- ERROR_MESSAGE - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata
-
TupleTag for any error response.
- ERROR_MSG_QUERY_FN - Static variable in class org.apache.beam.sdk.io.mongodb.MongoDbIO
- ERROR_ROW_SCHEMA - Static variable in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
- ERROR_ROW_WITH_ERR_MSG_SCHEMA - Static variable in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
- ERROR_SCHEMA - Static variable in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
- ERROR_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- ERROR_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
- ERROR_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider
- errorCodeFn - Variable in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
- ErrorContainer<T> - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
ErrorContainer interface.
- ErrorCounterFn(String, SerializableFunction<Row, byte[]>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
- ErrorCounterFn(String, SerializableFunction<Row, byte[]>, Schema, boolean, List<String>, Schema) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
- ErrorFn(String, Schema, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.ErrorFn
- ErrorFn(String, SerializableFunction<byte[], Row>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
- ErrorFn(String, SerializableFunction<byte[], Row>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider.ErrorFn
- ErrorFn(String, SerializableFunction<byte[], Row>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider.ErrorFn
- ErrorFn(String, SerializableFunction<byte[], Row>, Schema, List<String>, String, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
- ErrorFn(String, SerializableFunction<Row, byte[]>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider.ErrorFn
- ErrorHandler<ErrorT,
OutputT> - Interface in org.apache.beam.sdk.transforms.errorhandling -
An Error Handler is a utility object used for plumbing error PCollections to a configured sink. Error Handlers must be closed before a pipeline is run in order to properly pipe error collections to the sink, and the pipeline will be rejected if any handlers aren't closed.
- ErrorHandler.BadRecordErrorHandler<OutputT> - Class in org.apache.beam.sdk.transforms.errorhandling
- ErrorHandler.DefaultErrorHandler<ErrorT,
OutputT> - Class in org.apache.beam.sdk.transforms.errorhandling -
A default, placeholder error handler that exists to allow usage of .addErrorCollection() without effects.
- ErrorHandler.PTransformErrorHandler<ErrorT,
OutputT> - Class in org.apache.beam.sdk.transforms.errorhandling - ErrorHandler.PTransformErrorHandler.WriteErrorMetrics<ErrorT> - Class in org.apache.beam.sdk.transforms.errorhandling
- ErrorHandler.PTransformErrorHandler.WriteErrorMetrics.CountErrors<ErrorT> - Class in org.apache.beam.sdk.transforms.errorhandling
- ErrorHandling - Class in org.apache.beam.sdk.schemas.transforms.providers
- ErrorHandling() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling
- ErrorHandling() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling
- ErrorHandling() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling
- ErrorHandling() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- ErrorHandling.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
- errorRecord(Schema, byte[], Throwable) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- errorRecord(Schema, Row, Throwable) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- errorSchema(Schema) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- errorSchemaBytes() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- estimate() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker.RangeEndEstimator
- estimateCount(T, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
-
Utility class to retrieve the estimated frequency of an element from a
CountMinSketch. - estimateFractionForKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns the fraction of this range
[startKey, endKey) that is in the interval [startKey, key). - estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
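The range-fraction arithmetic behind estimateFractionForKey can be sketched in plain Java: keys are treated as unsigned big-endian numbers, right-padded with zero bytes to a common length, and the fraction is (key - start) / (end - start). This is an illustration of the idea with hypothetical names, not Beam's exact implementation.

```java
import java.math.BigInteger;

public class KeyFractionDemo {
    // Fraction of the half-open range [start, end) that lies in [start, key).
    static double fractionForKey(byte[] start, byte[] key, byte[] end) {
        int len = Math.max(Math.max(start.length, key.length), end.length);
        BigInteger s = toPadded(start, len);
        BigInteger k = toPadded(key, len);
        BigInteger e = toPadded(end, len);
        return k.subtract(s).doubleValue() / e.subtract(s).doubleValue();
    }

    // Right-pad with zero bytes so all keys compare at the same scale.
    static BigInteger toPadded(byte[] bytes, int len) {
        byte[] padded = new byte[len];
        System.arraycopy(bytes, 0, padded, 0, bytes.length);
        return new BigInteger(1, padded);
    }

    public static void main(String[] args) {
        // Key 0x08 sits halfway through the range [0x00, 0x10).
        System.out.println(fractionForKey(
                new byte[]{0x00}, new byte[]{0x08}, new byte[]{0x10})); // prints 0.5
    }
}
```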
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
-
This method is called by
org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats. - estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
- estimateRowCount(PipelineOptions) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
-
Estimates the number of non-empty rows.
- estimateRowCount(RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- eval(BatchTSetEnvironment, SinkTSet<?>) - Method in class org.apache.beam.runners.twister2.BeamBatchWorker
- eval(SinkTSet<?>) - Method in class org.apache.beam.runners.twister2.Twister2BatchTranslationContext
- eval(SinkTSet<?>) - Method in class org.apache.beam.runners.twister2.Twister2StreamTranslationContext
- eval(SinkTSet<?>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- eval(Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.PatternCondition
- evaluate() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext
-
Trigger evaluation of all leaf datasets.
- evaluate(String, Dataset<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext
-
The purpose of this utility is to mark the evaluation of Spark actions, both during Pipeline translation, when evaluation is required, and when finally evaluating the pipeline.
- evaluate(ParDo.MultiOutput<KV<KeyT, ValueT>, OutputT>, EvaluationContext) - Method in class org.apache.beam.runners.spark.translation.streaming.StatefulStreamingParDoEvaluator
- evaluate(TransformT, EvaluationContext) - Method in interface org.apache.beam.runners.spark.translation.TransformEvaluator
- EvaluationContext - Class in org.apache.beam.runners.spark.structuredstreaming.translation
-
The
EvaluationContext is the result of a pipeline translation and can be used to evaluate / run the pipeline. - EvaluationContext - Class in org.apache.beam.runners.spark.translation
-
The EvaluationContext allows us to define pipeline instructions and translate between
PObject<T>s or PCollection<T>s and Ts or DStreams/RDDs of Ts. - EvaluationContext(JavaSparkContext, Pipeline, PipelineOptions) - Constructor for class org.apache.beam.runners.spark.translation.EvaluationContext
- EvaluationContext(JavaSparkContext, Pipeline, PipelineOptions, JavaStreamingContext) - Constructor for class org.apache.beam.runners.spark.translation.EvaluationContext
- Evaluator(SparkPipelineTranslator, EvaluationContext) - Constructor for class org.apache.beam.runners.spark.SparkRunner.Evaluator
- event() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
- EVENT_TIME - Enum constant in enum class org.apache.beam.sdk.state.TimeDomain
-
The
TimeDomain.EVENT_TIME domain corresponds to the timestamps on the elements. - EventExaminer<EventT,
StateT> - Interface in org.apache.beam.sdk.extensions.ordered -
Classes extending this interface will be called by
OrderedEventProcessor to examine every incoming event. - eventStore() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- eventStore() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- eventStore() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
- Event Timestamps and Watermark - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- eventually(Matcher<Iterable<? extends Row>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery.RowsAssertion
- ever() - Static method in class org.apache.beam.sdk.transforms.windowing.Never
-
Returns a trigger which never fires.
- every(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
Returns a new
SlidingWindows with the original size, that assigns timestamps into half-open intervals of the form [N * period, N * period + size), where 0 is the epoch. - EXACTLY_ONCE - Static variable in interface org.apache.beam.runners.flink.FlinkPipelineOptions
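The half-open interval rule above determines which sliding windows a given timestamp falls into. A plain-Java sketch of that assignment rule (hypothetical helper, not the Beam API):

```java
import java.util.ArrayList;
import java.util.List;

public class SlidingWindowDemo {
    // Windows are half-open intervals [N * period, N * period + size).
    // Returns the start offsets of every window containing timestamp `ts`,
    // from the latest window backwards.
    static List<Long> windowStartsFor(long ts, long size, long period) {
        List<Long> starts = new ArrayList<>();
        long lastStart = ts - ts % period; // largest N * period <= ts
        for (long start = lastStart; start > ts - size; start -= period) {
            if (start >= 0) {
                starts.add(start);
            }
        }
        return starts;
    }

    public static void main(String[] args) {
        // size 10, period 5: ts=12 falls in [10, 20) and [5, 15)
        System.out.println(windowStartsFor(12, 10, 5)); // prints [10, 5]
    }
}
```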
- Example - Search tag in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO
- Section
- Example - Search tag in class org.apache.beam.sdk.io.mqtt.MqttIO
- Section
- Example - Search tag in interface org.apache.beam.sdk.io.range.RangeTracker
- Section
- Example - Search tag in org.apache.beam.sdk.io.csv.CsvIOParse.withCustomRecordParsing(String, SerializableFunction<String, OutputT>)
- Section
- Example: Matching a PCollection of filepatterns arriving from Kafka - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Example: Returning filenames and contents of compressed files matching a filepattern - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Example: Watching a single filepattern for new files - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Example: Writing CSV files - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Example: Writing CSV files to different directories and with different headers - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Example 1: Approximate Count of Ints PCollection<Integer> and specify precision - Search tag in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- Section
- Example 1: basic use - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Example 1: Create long-type sketch for a PCollection<Long> and specify precision - Search tag in class org.apache.beam.sdk.extensions.zetasketch.HllCount
- Section
- Example 1: default use - Search tag in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
- Section
- Example 1: Default use - Search tag in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
- Section
- Example 1: globally default use - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Example 2: Approximate Count of Key Value PCollection<KV<Integer,Foo>> - Search tag in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- Section
- Example 2: Create bytes-type sketch for a PCollection<KV<String, byte[]>> - Search tag in class org.apache.beam.sdk.extensions.zetasketch.HllCount
- Section
- Example 2: per key default use - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Example 2: tune accuracy parameters - Search tag in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
- Section
- Example 2: tune accuracy parameters - Search tag in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
- Section
- Example 2: use the CombineFn in a stateful ParDo - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Example 3: Approximate Count of Key Value PCollection<KV<Integer,Foo>> - Search tag in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- Section
- Example 3: Merge existing sketches in a PCollection<byte[]> into a new one - Search tag in class org.apache.beam.sdk.extensions.zetasketch.HllCount
- Section
- Example 3: query the resulting sketch - Search tag in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
- Section
- Example 3 : Query the resulting structure - Search tag in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
- Section
- Example 3: tune precision and use sparse representation - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Example 3: use the RetrieveCardinality utility class - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Example 4: Estimates the count of distinct elements in a PCollection<String> - Search tag in class org.apache.beam.sdk.extensions.zetasketch.HllCount
- Section
- Example 4: Using the CombineFn - Search tag in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
- Section
- Example 4: Using the CombineFn - Search tag in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
- Section
- Example PubsubIO read usage - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
- Section
- Example PubsubIO write usage - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
- Section
- Examples - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Examples - Search tag in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
- Section
- Examples - Search tag in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
- Section
- Examples - Search tag in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- Section
- Examples - Search tag in class org.apache.beam.sdk.extensions.zetasketch.HllCount
- Section
- Example usage - Search tag in org.apache.beam.sdk.io.csv.CsvIO.parse(Class<T>, CSVFormat)
- Section
- Example usage - Search tag in org.apache.beam.sdk.io.csv.CsvIO.parseRows(Schema, CSVFormat)
- Section
- Example usage: - Search tag in class org.apache.beam.sdk.io.csv.CsvIO
- Section
- Example usage: - Search tag in class org.apache.beam.sdk.io.json.JsonIO
- Section
- exceptAll() - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new
PTransform that follows SET ALL semantics: it takes a PCollectionList<PCollection<T>> and returns a PCollection<T> containing the difference all (exceptAll) of the collections, applied in order over all collections in the PCollectionList<T>. - exceptAll(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new
PTransform that follows SET ALL semantics to compute the difference all (exceptAll) with the provided PCollection<T>. - exceptDistinct() - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a
PTransform that takes a PCollectionList<PCollection<T>> and returns a PCollection<T> containing the difference (except) of the collections, applied in order over all collections in the PCollectionList<T>. - exceptDistinct(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new
PTransform that follows SET DISTINCT semantics to compute the difference (except) with the provided PCollection<T>. - exception() - Method in class org.apache.beam.sdk.transforms.WithFailures.ExceptionElement
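The SET ALL vs. SET DISTINCT semantics described in the Sets entries above can be sketched over plain Java lists: "exceptAll" is a multiset difference that removes one left-hand occurrence per matching right-hand occurrence, while "except" keeps only distinct left-hand elements absent from the right. An illustration of the semantics only, not the Beam transforms:

```java
import java.util.*;

public class SetSemanticsDemo {
    // SET ALL difference: remove one occurrence from `left` per occurrence in `right`.
    static <T> List<T> exceptAll(List<T> left, List<T> right) {
        Map<T, Integer> counts = new HashMap<>();
        for (T t : right) counts.merge(t, 1, Integer::sum);
        List<T> out = new ArrayList<>();
        for (T t : left) {
            int c = counts.getOrDefault(t, 0);
            if (c > 0) counts.put(t, c - 1); // consumed by a right-hand occurrence
            else out.add(t);
        }
        return out;
    }

    // SET DISTINCT difference: distinct left-hand elements not present in `right`.
    static <T> List<T> exceptDistinct(List<T> left, List<T> right) {
        Set<T> exclude = new HashSet<>(right);
        List<T> out = new ArrayList<>();
        for (T t : new LinkedHashSet<>(left)) {
            if (!exclude.contains(t)) out.add(t);
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> l = Arrays.asList("a", "a", "b", "c");
        List<String> r = Arrays.asList("a", "c");
        System.out.println(exceptAll(l, r));      // prints [a, b]
        System.out.println(exceptDistinct(l, r)); // prints [b]
    }
}
```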
- exception_thrown - Enum constant in enum class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent.Reason
- ExceptionAsMapHandler() - Constructor for class org.apache.beam.sdk.transforms.WithFailures.ExceptionAsMapHandler
- ExceptionElement() - Constructor for class org.apache.beam.sdk.transforms.WithFailures.ExceptionElement
- exceptionHandler - Variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- exceptionsInto(TypeDescriptor<FailureT>) - Method in class org.apache.beam.sdk.transforms.MapKeys
-
Returns a new
SimpleMapWithFailures transform that catches exceptions raised while mapping elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using SimpleMapWithFailures.exceptionsVia(ProcessFunction). - exceptionsInto(TypeDescriptor<FailureT>) - Method in class org.apache.beam.sdk.transforms.MapValues
-
Returns a new
SimpleMapWithFailures transform that catches exceptions raised while mapping elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using SimpleMapWithFailures.exceptionsVia(ProcessFunction). - exceptionsInto(TypeDescriptor<NewFailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
-
Returns a new
AsJsons.AsJsonsWithFailures transform that catches exceptions raised while writing JSON elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using AsJsons.AsJsonsWithFailures.exceptionsVia(ProcessFunction). - exceptionsInto(TypeDescriptor<NewFailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
-
Returns a new
ParseJsons.ParseJsonsWithFailures transform that catches exceptions raised while parsing elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using ParseJsons.ParseJsonsWithFailures.exceptionsVia(ProcessFunction). - exceptionsInto(TypeDescriptor<NewFailureT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
-
Returns a new
FlatMapElements.FlatMapWithFailures transform that catches exceptions raised while mapping elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using FlatMapElements.FlatMapWithFailures.exceptionsVia(ProcessFunction). - exceptionsInto(TypeDescriptor<NewFailureT>) - Method in class org.apache.beam.sdk.transforms.MapElements
-
Returns a new
MapElements.MapWithFailures transform that catches exceptions raised while mapping elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using MapElements.MapWithFailures.exceptionsVia(ProcessFunction). - exceptionsVia() - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
-
Returns a new
AsJsons.AsJsonsWithFailures transform that catches exceptions raised while writing JSON elements, passing the raised exception instance and the input element being processed through the default exception handler AsJsons.DefaultExceptionAsMapHandler and emitting the result to a failure collection. - exceptionsVia() - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
-
Returns a new
ParseJsons.ParseJsonsWithFailures transform that catches exceptions raised while parsing elements, passing the raised exception instance and the input element being processed through the default exception handler ParseJsons.DefaultExceptionAsMapHandler and emitting the result to a failure collection. - exceptionsVia(InferableFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
-
Returns a new
AsJsons.AsJsonsWithFailures transform that catches exceptions raised while writing JSON elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection. - exceptionsVia(InferableFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
-
Returns a new
FlatMapElements.FlatMapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection. - exceptionsVia(InferableFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.transforms.MapElements
-
Returns a new
MapElements.MapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection. - exceptionsVia(InferableFunction<WithFailures.ExceptionElement<String>, FailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
-
Returns a new
ParseJsons.ParseJsonsWithFailures transform that catches exceptions raised while parsing elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection. - exceptionsVia(InferableFunction<WithFailures.ExceptionElement<KV<K, V1>>, FailureT>) - Method in class org.apache.beam.sdk.transforms.MapValues
-
Returns a new
SimpleMapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection. - exceptionsVia(InferableFunction<WithFailures.ExceptionElement<KV<K1, V>>, FailureT>) - Method in class org.apache.beam.sdk.transforms.MapKeys
-
Returns a new
SimpleMapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection. - exceptionsVia(ProcessFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons.AsJsonsWithFailures
-
Returns a new
AsJsons.AsJsonsWithFailures transform that catches exceptions raised while writing JSON elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection. - exceptionsVia(ProcessFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements.FlatMapWithFailures
-
Returns a new
FlatMapElements.FlatMapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection. - exceptionsVia(ProcessFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.transforms.MapElements.MapWithFailures
-
Returns a
PTransform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection. - exceptionsVia(ProcessFunction<WithFailures.ExceptionElement<String>, FailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons.ParseJsonsWithFailures
-
Returns a new
ParseJsons.ParseJsonsWithFailures transform that catches exceptions raised while parsing elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection. - ExecutableGraph<ExecutableT,
CollectionT> - Interface in org.apache.beam.runners.direct -
The interface that enables querying of a graph of independently executable stages and the inputs and outputs of those stages.
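The exceptionsInto/exceptionsVia entries above all share one pattern: apply a function per element, catch any exception raised, and route the failing element together with its exception into a separate failure collection instead of failing the whole job. A plain-Java sketch of that pattern (hypothetical names, not the Beam API):

```java
import java.util.*;
import java.util.function.Function;

public class WithFailuresDemo {
    // Result of a mapping step: successful outputs plus failure records.
    // Hypothetical plain-Java analogue of the exceptionsVia pattern.
    record Result<O>(List<O> output, List<String> failures) {}

    static <I, O> Result<O> mapWithFailures(List<I> input, Function<I, O> fn) {
        List<O> out = new ArrayList<>();
        List<String> failures = new ArrayList<>();
        for (I element : input) {
            try {
                out.add(fn.apply(element));
            } catch (Exception e) {
                // The "exception handler": turn (element, exception) into a
                // failure record instead of propagating the exception.
                failures.add(element + ": " + e.getClass().getSimpleName());
            }
        }
        return new Result<>(out, failures);
    }

    public static void main(String[] args) {
        Result<Integer> r = mapWithFailures(
                Arrays.asList("1", "2", "oops"), Integer::parseInt);
        System.out.println(r.output());   // prints [1, 2]
        System.out.println(r.failures()); // prints [oops: NumberFormatException]
    }
}
```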
- ExecutableProcessBundleDescriptor() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
- ExecutableStageContext - Interface in org.apache.beam.runners.fnexecution.control
-
The context required in order to execute
stages. - ExecutableStageContext.Factory - Interface in org.apache.beam.runners.fnexecution.control
-
Creates
ExecutableStageContext instances. - ExecutableStageDoFnOperator<InputT,
OutputT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming -
This operator is the streaming equivalent of the
FlinkExecutableStageFunction. - ExecutableStageDoFnOperator(String, Coder<WindowedValue<InputT>>, Map<TupleTag<?>, Coder<?>>, TupleTag<OutputT>, List<TupleTag<?>>, DoFnOperator.OutputManagerFactory<OutputT>, Map<Integer, PCollectionView<?>>, Collection<PCollectionView<?>>, Map<RunnerApi.ExecutableStagePayload.SideInputId, PCollectionView<?>>, PipelineOptions, RunnerApi.ExecutableStagePayload, JobInfo, FlinkExecutableStageContextFactory, Map<String, TupleTag<?>>, WindowingStrategy, Coder, KeySelector<WindowedValue<InputT>, ?>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
-
Constructor.
- execute() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
- execute(BatchTSetEnvironment) - Method in class org.apache.beam.runners.twister2.BeamBatchWorker
- execute(Runnable) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- execute(String) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
- execute(String) - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator.Executor
- execute(String) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
- execute(String) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
-
Executes the given SQL.
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateCatalog
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateDatabase
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateExternalTable
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropCatalog
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropDatabase
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropTable
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlSetOptionBeam
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlUseCatalog
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlUseDatabase
- execute(Expression, Class<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- execute(Expression, Type) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- EXECUTE_BUNDLE - Enum constant in enum class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.WriteMethod
- executeBundles(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- executeBundles(ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- ExecuteBundles(String) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
- ExecuteBundles(ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
-
Instantiates a new ExecuteBundles.
- executeDdl(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
- executeFhirBundle(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Executes a FHIR bundle and returns the HTTP body.
- executeFhirBundle(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- executePipeline(BatchTSetEnvironment) - Method in class org.apache.beam.runners.twister2.BeamBatchWorker
- executeQuery(Queryable<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- ExecutionDriver - Interface in org.apache.beam.runners.local
-
Drives the execution of a
Pipeline by scheduling work. - ExecutionDriver.DriverState - Enum Class in org.apache.beam.runners.local
-
The state of the driver.
- ExecutorOptions - Interface in org.apache.beam.sdk.options
-
Options for configuring the
ScheduledExecutorService used throughout the Java runtime. - ExecutorOptions.ScheduledExecutorServiceFactory - Class in org.apache.beam.sdk.options
-
Returns the default
ScheduledExecutorService to use within the Apache Beam SDK. - ExecutorServiceFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.ExecutorServiceFactory
- exists() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
- exitArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.arrayQualifier(). - exitArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.arrayQualifier(). - exitArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by the
arrayQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList(). - exitArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by the
arrayQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList(). - exitDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.dotExpression(). - exitDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.dotExpression(). - exitEveryRule(ParserRuleContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
- exitFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.fieldSpecifier(). - exitFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.fieldSpecifier(). - exitMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.mapQualifier(). - exitMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.mapQualifier(). - exitMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by the
mapQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList(). - exitMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by the
mapQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList(). - exitQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.qualifiedComponent(). - exitQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.qualifiedComponent(). - exitQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by the
qualifyComponent labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent(). - exitQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by the
qualifyComponent labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent(). - exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifyComponentContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.WildcardContext
- exitSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by the
simpleIdentifier labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent(). - exitSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by the
simpleIdentifier labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent(). - exitWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by the
wildcard labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent(). - exitWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by the
wildcard labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent(). - expand() - Method in class org.apache.beam.io.requestresponse.Result
- expand() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
- expand() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write.Result
- expand() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches.Result
- expand() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
- expand() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- expand() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteResult
- expand() - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
- expand() - Method in class org.apache.beam.sdk.io.solace.write.SolaceOutput
- expand() - Method in class org.apache.beam.sdk.io.WriteFilesResult
- expand() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Expands the component
PCollections, stripping off any tag-specific information. - expand() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
- expand() - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
- expand() - Method in class org.apache.beam.sdk.values.PBegin
- expand() - Method in class org.apache.beam.sdk.values.PCollection
- expand() - Method in class org.apache.beam.sdk.values.PCollectionList
- expand() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
- expand() - Method in class org.apache.beam.sdk.values.PCollectionTuple
- expand() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
- expand() - Method in class org.apache.beam.sdk.values.PDone
- expand() - Method in interface org.apache.beam.sdk.values.PInput
- expand() - Method in interface org.apache.beam.sdk.values.POutput
- expand() - Method in interface org.apache.beam.sdk.values.PValue
-
Deprecated.A
PValue always expands into itself. Calling PValue.expand() on a PValue is almost never appropriate. - expand(InputT) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
- expand(InputT) - Method in class org.apache.beam.sdk.extensions.yaml.YamlTransform
- expand(InputT) - Method in class org.apache.beam.sdk.transforms.PTransform
-
Override this method to specify how this
PTransform should be expanded on the given InputT. - expand(ExpansionApi.ExpansionRequest, StreamObserver<ExpansionApi.ExpansionResponse>) - Method in class org.apache.beam.sdk.expansion.service.ExpansionService
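The PTransform.expand entry above is the extension point for composite transforms: a subclass describes how its input expands into its output. Since a full Beam pipeline cannot run inline here, the pattern can be sketched with hypothetical simplified stand-ins for PCollection and PTransform (MiniCollection, MiniTransform, and WordLengths are illustrative names, not Beam API):

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical, simplified stand-in for Beam's PCollection.
final class MiniCollection<T> {
    final List<T> elements;
    MiniCollection(List<T> elements) { this.elements = elements; }
}

// Hypothetical stand-in for PTransform: subclasses override expand()
// to specify how the input expands into the output.
abstract class MiniTransform<InputT, OutputT> {
    abstract MiniCollection<OutputT> expand(MiniCollection<InputT> input);
}

// A composite transform mapping each word to its length.
class WordLengths extends MiniTransform<String, Integer> {
    @Override
    MiniCollection<Integer> expand(MiniCollection<String> input) {
        return new MiniCollection<>(
            input.elements.stream().map(String::length).collect(Collectors.toList()));
    }
}

public class ExpandSketch {
    public static void main(String[] args) {
        MiniCollection<Integer> out =
            new WordLengths().expand(new MiniCollection<>(List.of("a", "bee", "cede")));
        System.out.println(out.elements); // prints [1, 3, 4]
    }
}
```

In real Beam code, expand() is called by the runner via PCollection.apply, not invoked directly as here.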
- expand(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Expands a pattern into matched paths.
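GcsUtil.expand matches a wildcard pattern against objects in Google Cloud Storage. As a local analogue only (not the Beam implementation), the same expansion idea can be sketched against an in-memory list of paths using the JDK's glob PathMatcher:

```java
import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.PathMatcher;
import java.util.List;
import java.util.stream.Collectors;

public class GlobSketch {
    // Return the candidate paths that match the given glob pattern.
    static List<String> expand(String glob, List<String> candidates) {
        PathMatcher matcher = FileSystems.getDefault().getPathMatcher("glob:" + glob);
        return candidates.stream()
            .filter(p -> matcher.matches(Path.of(p)))
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> files = List.of("logs/a.txt", "logs/b.csv", "data/c.txt");
        System.out.println(expand("logs/*.txt", files)); // prints [logs/a.txt]
    }
}
```

Note that with glob semantics, `*` does not cross a `/` separator, which is why only paths directly under `logs/` match.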
- expand(KeyedPCollectionTuple<K>) - Method in class org.apache.beam.sdk.transforms.join.CoGroupByKey
- expand(PBegin) - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
- expand(PBegin) - Method in class org.apache.beam.runners.spark.io.CreateStream
- expand(PBegin) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
- expand(PBegin) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorPTransform
- expand(PBegin) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.KinesisReadToBytes
- expand(PBegin) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
- expand(PBegin) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.FileIO.Match
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.ListHL7v2Messages
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscribeTransform
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
- expand(PBegin) - Method in class org.apache.beam.sdk.io.GenerateSequence
- expand(PBegin) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- expand(PBegin) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
- expand(PBegin) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
- expand(PBegin) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.TypedWithoutMetadata
- expand(PBegin) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
- expand(PBegin) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.Read.Bounded
- expand(PBegin) - Method in class org.apache.beam.sdk.io.Read.Unbounded
- expand(PBegin) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
- expand(PBegin) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.TextIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
- expand(PBegin) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.testing.PAssert.OneSideInputAssert
- expand(PBegin) - Method in class org.apache.beam.sdk.testing.TestStream
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.OfValueProvider
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.Values
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.WindowedValues
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Impulse
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
- expand(PCollection) - Method in class org.apache.beam.sdk.schemas.transforms.Join.Impl
- expand(PCollection<?>) - Method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.extensions.python.transforms.PythonMap
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.schemas.io.DeadLetteredTransform
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.MapElements
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
- expand(PCollection<? extends Iterable<T>>) - Method in class org.apache.beam.sdk.transforms.Flatten.Iterables
- expand(PCollection<? extends KV<?, V>>) - Method in class org.apache.beam.sdk.transforms.Values
- expand(PCollection<? extends KV<K, ?>>) - Method in class org.apache.beam.sdk.transforms.Keys
- expand(PCollection<? extends KV<K, ? extends Iterable<InputT>>>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
- expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ProtoFromBytes
- expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Write
- expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
- expand(PCollection<SearchGoogleAdsStreamRequest>) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.ReadAll
- expand(PCollection<GenericJson>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
-
The transform converts the contents of input PCollection into
CatalogItems and then calls the Recommendation AI service to create the catalog item. - expand(PCollection<GenericJson>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- expand(PCollection<GenericJson>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
-
The transform converts the contents of input PCollection into
UserEvents and then calls the Recommendation AI service to create the user event. - expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRow
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRowFlat
- expand(PCollection<Document>) - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText
- expand(PCollection<PubSubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.AddUuidsTransform
- expand(PCollection<PubSubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage
- expand(PCollection<SequencedMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidDeduplicationTransform
- expand(PCollection<Mutation>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- expand(PCollection<Entity>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
- expand(PCollection<Entity>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
- expand(PCollection<Entity>) - Method in class org.apache.beam.sdk.io.gcp.datastore.EntityToRow
- expand(PCollection<Key>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
- expand(PCollection<BatchGetDocumentsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments
- expand(PCollection<ListCollectionIdsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds
- expand(PCollection<ListDocumentsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments
- expand(PCollection<PartitionQueryRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery
- expand(PCollection<RunQueryRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery
- expand(PCollection<Write>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue
- expand(PCollection<Write>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary
- expand(PCollection<ByteString>) - Method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromBytes
- expand(PCollection<ElemT>) - Method in class org.apache.beam.runners.dataflow.CreateDataflowView
- expand(PCollection<ElemT>) - Method in class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView
- expand(PCollection<ElemT>) - Method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
- expand(PCollection<ErrorT>) - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler.WriteErrorMetrics
- expand(PCollection<EventT>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons.AsJsonsWithFailures
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.GloballyDistinct
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.GlobalSketch
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineGlobally
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements.FlatMapWithFailures
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.MapElements.MapWithFailures
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
- expand(PCollection<Double>) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.GlobalDigest
- expand(PCollection<String>) - Method in class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.CountWords
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
-
Deprecated.
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
-
Deprecated.
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons.ParseJsonsWithFailures
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromUri
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.CsvToRow
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.LinesReadConverter
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.csv.CsvIOParse
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.Read
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
-
Deprecated.
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.TextIO.Write
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.AllMatches
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Find
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindAll
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindKV
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindName
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindNameKV
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Matches
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesKV
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesName
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesNameKV
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.ReplaceAll
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.ReplaceFirst
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Split
- expand(PCollection<List<ElemT>>) - Method in class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView.CreateSparkPCollectionView
- expand(PCollection<CassandraIO.Read<T>>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.ReadAll
- expand(PCollection<ElasticsearchIO.Document>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.ReadFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ParseFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.TextIO.ReadFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.TFRecordIO.ReadFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.ReadFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
- expand(PCollection<MatchResult.Metadata>) - Method in class org.apache.beam.sdk.io.FileIO.ReadMatches
- expand(PCollection<FhirBundleParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
- expand(PCollection<FhirIOPatientEverything.PatientEverythingParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
- expand(PCollection<FhirSearchParameter<T>>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
- expand(PCollection<HL7v2Message>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
- expand(PCollection<HL7v2ReadParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read
- expand(PCollection<HL7v2ReadParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message
- expand(PCollection<PubsubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
- expand(PCollection<MutationGroup>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
- expand(PCollection<ReadOperation>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- expand(PCollection<HBaseIO.Read>) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.ReadAll
- expand(PCollection<KafkaRecord<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadRedistribute
- expand(PCollection<KafkaSourceDescriptor>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- expand(PCollection<RabbitMqMessage>) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
- expand(PCollection<SolrIO.Read>) - Method in class org.apache.beam.sdk.io.solr.SolrIO.ReadAll
- expand(PCollection<SplunkEvent>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
- expand(PCollection<SuccessOrFailure>) - Method in class org.apache.beam.sdk.testing.PAssert.DefaultConcludeTransform
- expand(PCollection<PeriodicSequence.SequenceDefinition>) - Method in class org.apache.beam.sdk.transforms.PeriodicSequence
- expand(PCollection<KV<byte[], RowMutations>>) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
- expand(PCollection<KV<ByteString, VideoContext>>) - Method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromBytesWithContext
- expand(PCollection<KV<ByteString, Iterable<Mutation>>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
- expand(PCollection<KV<ByteString, Iterable<Mutation>>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
- expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
- expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages
- expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
- expand(PCollection<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteRecordsInconsistent
- expand(PCollection<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords
- expand(PCollection<KV<EventKeyT, KV<Long, EventT>>>) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessor
- expand(PCollection<KV<String, GenericJson>>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
-
The transform converts the contents of input PCollection into
CatalogItems and then calls the Recommendation AI service to create the catalog item. - expand(PCollection<KV<String, GenericJson>>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
-
The transform converts the contents of input PCollection into
UserEvents and then calls the Recommendation AI service to create the user event. - expand(PCollection<KV<String, VideoContext>>) - Method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromURIWithContext
- expand(PCollection<KV<String, String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
-
The transform converts the contents of input PCollection into
Table.Rows and then calls Cloud DLP service to perform the deidentification according to provided settings. - expand(PCollection<KV<String, String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
-
The transform converts the contents of the input PCollection into
Table.Rows and then calls the Cloud DLP service to perform the data inspection according to the provided settings. - expand(PCollection<KV<String, String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
-
The transform converts the contents of the input PCollection into
Table.Rows and then calls the Cloud DLP service to perform the reidentification according to the provided settings. - expand(PCollection<KV<String, String>>) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
- expand(PCollection<KV<String, Map<String, String>>>) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
- expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
- expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
- expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
- expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.WithShardedKey
- expand(PCollection<KV<K, Double>>) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.PerKeyDigest
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.PerKeyDistinct
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.PerKeySketch
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.PerKey
-
Deprecated.
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.Deduplicate.KeyedValues
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.GroupByEncryptedKey
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.GroupByKey
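To illustrate what GroupByKey's expand produces, here is a plain-Java sketch of its per-bundle semantics (PCollection&lt;KV&lt;K, V&gt;&gt; becomes PCollection&lt;KV&lt;K, Iterable&lt;V&gt;&gt;&gt;). This is illustrative only and does not use the Beam SDK; the real transform shuffles across workers and respects windowing.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class GroupByKeySketch {
    // Groups key/value pairs by key, mirroring GroupByKey for a single
    // in-memory bundle: every value observed for a key is collected into
    // one list keyed by that key.
    static <K, V> Map<K, List<V>> groupByKey(List<Map.Entry<K, V>> pairs) {
        Map<K, List<V>> grouped = new LinkedHashMap<>();
        for (Map.Entry<K, V> e : pairs) {
            grouped.computeIfAbsent(e.getKey(), k -> new ArrayList<>()).add(e.getValue());
        }
        return grouped;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> input = List.of(
            Map.entry("a", 1), Map.entry("b", 2), Map.entry("a", 3));
        System.out.println(groupByKey(input)); // {a=[1, 3], b=[2]}
    }
}
```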
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.KvSwap
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeByKey
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.Reshuffle
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.View.AsMap
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.View.AsMultimap
- expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.extensions.joinlibrary.Join.FullOuterJoin
- expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.extensions.joinlibrary.Join.InnerJoin
- expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.extensions.joinlibrary.Join.LeftOuterJoin
- expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.extensions.joinlibrary.Join.RightOuterJoin
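The join-library transforms above all pair values that share a key. As a conceptual sketch of Join.InnerJoin's semantics, here is a plain-Java version operating on pre-grouped maps (illustrative only; the real transform joins two PCollection&lt;KV&lt;K, V&gt;&gt; inputs via CoGroupByKey):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class InnerJoinSketch {
    // For each key present in BOTH inputs, emit every pairing of a left
    // value with a right value (the cross product per key). Keys present
    // in only one input produce no output, which is what distinguishes an
    // inner join from the outer-join variants.
    static <K, V1, V2> List<String> innerJoin(Map<K, List<V1>> left, Map<K, List<V2>> right) {
        List<String> out = new ArrayList<>();
        for (K key : left.keySet()) {
            if (!right.containsKey(key)) continue;
            for (V1 l : left.get(key)) {
                for (V2 r : right.get(key)) {
                    out.add(key + ":" + l + "," + r);
                }
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, List<Integer>> left = Map.of("a", List.of(1), "b", List.of(2));
        Map<String, List<String>> right = Map.of("a", List.of("x", "y"));
        System.out.println(innerJoin(left, right)); // [a:1,x, a:1,y]
    }
}
```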
- expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.transforms.MapValues
- expand(PCollection<KV<K1, V>>) - Method in class org.apache.beam.sdk.transforms.MapKeys
- expand(PCollection<KV<KeyT, ValueT>>) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write
- expand(PCollection<KV<TableDestination, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
- expand(PCollection<KV<KafkaSourceDescriptor, KafkaRecord<K, V>>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaCommitOffset
- expand(PCollection<KV<ShardedKey<DestinationT>, Iterable<StorageApiWritePayload>>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritesShardedRecords
- expand(PCollection<KV<PrimaryKeyT, Iterable<KV<SecondaryKeyT, ValueT>>>>) - Method in class org.apache.beam.sdk.extensions.sorter.SortValues
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.python.transforms.DataframeTransform
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms.JoinAsLookup
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable.RowToDocument
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.LinesWriteConverter
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.WriteRows
- expand(PCollection<Mutation>) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
- expand(PCollection<HCatRecord>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
- expand(PCollection<ProducerRecord<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- expand(PCollection<Message>) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Write
- expand(PCollection<SolrInputDocument>) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
- expand(PCollection<Document>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable.DocumentToRow
- expand(PCollection<Document>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
- expand(PCollection<ParameterT>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
- expand(PCollection<ParameterT>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- expand(PCollection<ParameterT>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- expand(PCollection<RequestT>) - Method in class org.apache.beam.io.requestresponse.RequestResponseIO
- expand(PCollection<SendMessageRequest>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Write
-
Deprecated.
- expand(PCollection<T>) - Method in class org.apache.beam.runners.spark.io.ConsoleIO.Write.Unbound
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- expand(PCollection<ByteString>) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytes
-
Applies all necessary transforms to call the Vision API.
- expand(PCollection<KV<ByteString, ImageContext>>) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytesWithContext
-
Applies all necessary transforms to call the Vision API.
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUri
-
Applies all necessary transforms to call the Vision API.
- expand(PCollection<KV<String, ImageContext>>) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUriWithContext
-
Applies all necessary transforms to call the Vision API.
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.aws2.sns.SnsIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.CombineAsIterable
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.ReifyAsIterable
- expand(PCollection<Entity>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
- expand(PCollection<Key>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
- expand(PCollection<Entity>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ProtoToBytes
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.AddFields.Inner
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.Cast
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.DropFields.Inner
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.RenameFields.Inner
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.Select.Fields
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.Select.Flattened
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.WithKeys
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.testing.PAssert.GroupThenAssert
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.testing.PAssert.GroupThenAssertForSingleton
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.Globally
-
Deprecated.
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Deduplicate.Values
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Deduplicate.WithRepresentativeValues
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Distinct
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Distinct.WithRepresentativeValues
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Filter
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Partition
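Partition's expand routes each element to one of N outputs using a user-supplied function. A hedged, Beam-free sketch of that routing logic (the real transform returns a PCollectionList and rejects out-of-range partition numbers at runtime, which this sketch imitates):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.ToIntFunction;

public class PartitionSketch {
    // Distribute elements into numPartitions buckets according to fn,
    // mirroring Partition.of(numPartitions, partitionFn). A function
    // result outside [0, numPartitions) is an error.
    static <T> List<List<T>> partition(List<T> input, int numPartitions, ToIntFunction<T> fn) {
        List<List<T>> out = new ArrayList<>();
        for (int i = 0; i < numPartitions; i++) out.add(new ArrayList<>());
        for (T element : input) {
            int p = fn.applyAsInt(element);
            if (p < 0 || p >= numPartitions) {
                throw new IndexOutOfBoundsException("partition " + p);
            }
            out.get(p).add(element);
        }
        return out;
    }

    public static void main(String[] args) {
        List<Integer> nums = List.of(1, 2, 3, 4, 5);
        System.out.println(partition(nums, 2, n -> n % 2)); // [[2, 4], [1, 3, 5]]
    }
}
```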
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeArbitrarily
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Reshuffle.ViaRandomKey
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Tee
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.ToJson
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsIterable
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsList
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Wait.OnSignal
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.windowing.Window
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.WithTimestamps
- expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
- expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
- expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
- expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.io.WriteFiles
- expand(PCollection<V>) - Method in class org.apache.beam.sdk.transforms.WithKeys
- expand(PCollection<ValueT>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
- expand(PCollectionList<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase
- expand(PCollectionList<T>) - Method in class org.apache.beam.sdk.transforms.Flatten.PCollections
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider.BigQueryExportSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider.BigQueryFileLoadsSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteSchemaTransformProvider.BigQueryWriteSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.ExplodeTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.JavaFilterTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.JavaMapToFieldsTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.LoggingTransform
- expand(PCollectionTuple) - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.ExpandCrossProduct
- expand(PCollectionTuple) - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.Impl
- expand(PInput) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
- expand(PInput) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
- expand(PInput) - Method in class org.apache.beam.sdk.managed.Managed.ManagedTransform
- expandInconsistent(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- expandInput(PInput) - Static method in class org.apache.beam.sdk.values.PValues
- expandOutput(POutput) - Static method in class org.apache.beam.sdk.values.PValues
- expandTriggered(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>, Coder<StorageApiWritePayload>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- expandUntriggered(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- expandValue(PValue) - Static method in class org.apache.beam.sdk.values.PValues
- ExpansionServer - Class in org.apache.beam.sdk.expansion.service
-
A
gRPC Serverfor an ExpansionService. - ExpansionService - Class in org.apache.beam.sdk.expansion.service
-
A service that allows a pipeline to expand transforms from a remote SDK.
- ExpansionService() - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService
- ExpansionService(String[]) - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService
- ExpansionService(PipelineOptions) - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService
- ExpansionService(PipelineOptions, String) - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService
- ExpansionService.ExpansionServiceRegistrar - Interface in org.apache.beam.sdk.expansion.service
-
A registrar that creates
TransformProviderinstances fromRunnerApi.FunctionSpecs. - ExpansionService.ExternalTransformRegistrarLoader - Class in org.apache.beam.sdk.expansion.service
-
Exposes Java transforms via
ExternalTransformRegistrar. - ExpansionServiceConfig - Class in org.apache.beam.sdk.expansion.service
- ExpansionServiceConfig() - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
- ExpansionServiceConfigFactory() - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionServiceOptions.ExpansionServiceConfigFactory
- ExpansionServiceOptions - Interface in org.apache.beam.sdk.expansion.service
-
Options used to configure the
ExpansionService. - ExpansionServiceOptions.ExpansionServiceConfigFactory - Class in org.apache.beam.sdk.expansion.service
-
Loads the ExpansionService config.
- ExpansionServiceOptions.JavaClassLookupAllowListFactory - Class in org.apache.beam.sdk.expansion.service
-
Loads the allow list from
ExpansionServiceOptions.getJavaClassLookupAllowlistFile(), defaulting to an emptyJavaClassLookupTransformProvider.AllowList. - ExpansionServiceSchemaTransformProvider - Class in org.apache.beam.sdk.expansion.service
- expectDryRunQuery(String, String, JobStatistics) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- EXPECTED_SQN_PATTERN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
-
Expected valid pattern for a
StorageApiCDC.CHANGE_SQN_COLUMNvalue for use with BigQuery's_CHANGE_SEQUENCE_NUMBERformat. - expectFileToNotExist() - Method in class org.apache.beam.sdk.io.fs.CreateOptions
-
True if the file is expected not to exist.
- EXPERIMENTAL_HOST_INSTANCE_ID - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
-
Instance ID to use when connecting to an experimental host.
- ExperimentalOptions - Interface in org.apache.beam.sdk.options
-
Apache Beam provides a number of experimental features that can be enabled with this flag.
- explain(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
- explainLazily(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
-
A lazy explain via
Object.toString()for logging purposes. - explainQuery(String) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
-
Returns a human-readable representation of the query execution plan.
- explainTerms(RelWriter) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
- explainTerms(RelWriter) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamPushDownIOSourceRel
- explainTerms(RelWriter) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
- explicitRandomPartitioner(int) - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
-
Explicit hash key partitioner that randomly returns one of x precalculated hash keys.
- EXPLODE_WINDOWS - Enum constant in enum class org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator.Type
- explodeWindows() - Method in interface org.apache.beam.sdk.values.WindowedValue
-
A representation of each of the actual values represented by this compressed
WindowedValue, one per window. - explodeWindows() - Method in class org.apache.beam.sdk.values.WindowedValues.Builder
- Export(ValueProvider<String>, ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export
- EXPORT - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
-
Export data to Google Cloud Storage in Avro format and read data files from that location.
- exportFhirResourceToBigQuery(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Export a FHIR Resource to BigQuery.
- exportFhirResourceToBigQuery(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- exportFhirResourceToGcs(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Export a FHIR Resource to GCS.
- exportFhirResourceToGcs(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- exportResources(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Export resources to GCS.
- exportResources(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
- exportResources(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
- ExportResourcesFn(ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
- expressionsInFilter(List<RexNode>) - Static method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTableFilter
-
Count the number of
RexNodes involved in all supported filters. - extend(String) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
-
Extend the path by appending a sub-component path.
- External() - Constructor for class org.apache.beam.sdk.io.GenerateSequence.External
- External() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External
- External() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External
- ExternalConfiguration() - Constructor for class org.apache.beam.sdk.io.GenerateSequence.External.ExternalConfiguration
- ExternalEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
-
An
EnvironmentFactorywhich requests workers via the given URL in the Environment. - ExternalEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
-
Provider of ExternalEnvironmentFactory.
- ExternalRead - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Exposes
PubsubIO.Readas an external transform for cross-language usage. - ExternalRead() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead
- ExternalRead.Configuration - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Parameters class to expose the transform to an external SDK.
- ExternalRead.ReadBuilder - Class in org.apache.beam.sdk.io.gcp.pubsub
- ExternalSchemaIOTransformRegistrar - Class in org.apache.beam.sdk.extensions.schemaio.expansion
- ExternalSchemaIOTransformRegistrar() - Constructor for class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar
- ExternalSchemaIOTransformRegistrar.Configuration - Class in org.apache.beam.sdk.extensions.schemaio.expansion
- ExternalSorter - Class in org.apache.beam.sdk.extensions.sorter
-
Does an external sort of the provided values.
- ExternalSorter.Options - Class in org.apache.beam.sdk.extensions.sorter
-
ExternalSorter.Optionscontains configuration of the sorter. - ExternalSorter.Options.SorterType - Enum Class in org.apache.beam.sdk.extensions.sorter
-
Sorter type.
- ExternalSqlTransformRegistrar - Class in org.apache.beam.sdk.extensions.sql.expansion
- ExternalSqlTransformRegistrar() - Constructor for class org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar
- ExternalSqlTransformRegistrar.Configuration - Class in org.apache.beam.sdk.extensions.sql.expansion
- ExternalSynchronization - Interface in org.apache.beam.sdk.io.hadoop.format
-
Provides a mechanism for acquiring locks related to the job.
- ExternalTransformBuilder<ConfigT,
InputT, - Interface in org.apache.beam.sdk.transformsOutputT> -
An interface for building a transform from an externally provided configuration.
- ExternalTransformRegistrar - Interface in org.apache.beam.sdk.expansion
-
A registrar which contains a mapping from URNs to available
ExternalTransformBuilders. - ExternalTransformRegistrarImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- ExternalTransformRegistrarImpl() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
- ExternalTransformRegistrarLoader() - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService.ExternalTransformRegistrarLoader
- externalWithMetadata() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- ExternalWrite - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Exposes
PubsubIO.Writeas an external transform for cross-language usage. - ExternalWrite() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite
- ExternalWrite.Configuration - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Parameters class to expose the transform to an external SDK.
- ExternalWrite.ParsePubsubMessageProtoAsPayloadFromWindowedValue - Class in org.apache.beam.sdk.io.gcp.pubsub
- ExternalWrite.WriteBuilder - Class in org.apache.beam.sdk.io.gcp.pubsub
- extractFromTypeParameters(T, Class<? super T>, TypeDescriptors.TypeVariableExtractor<T, V>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
Extracts a type from the actual type parameters of a parameterized class, subject to Java type erasure.
- extractFromTypeParameters(TypeDescriptor<T>, Class<? super T>, TypeDescriptors.TypeVariableExtractor<T, V>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
Like
TypeDescriptors.extractFromTypeParameters(Object, Class, TypeVariableExtractor), but takes aTypeDescriptorof the instance being analyzed rather than the instance itself. - extractKeys(Object, Object[], int) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- extractOutput() - Method in interface org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator
-
Extract output.
- extractOutput() - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
-
Returns the output value that is the result of combining all the input values represented by this accumulator.
- extractOutput(double[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- extractOutput(int[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- extractOutput(long[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- extractOutput(long[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- extractOutput(AccumT) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- extractOutput(AccumT) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- extractOutput(AccumT) - Method in interface org.apache.beam.sdk.extensions.sql.udf.AggregateFn
-
Returns the output value that is the result of combining all the input values represented by the given accumulator.
- extractOutput(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
- extractOutput(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns the output value that is the result of combining all the input values represented by the given accumulator.
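extractOutput is the final step of the Combine.CombineFn lifecycle (createAccumulator, addInput, mergeAccumulators, extractOutput). A plain-Java sketch of that lifecycle for a mean, without the Beam SDK (the accumulator here is a hypothetical {sum, count} pair chosen for illustration):

```java
import java.util.List;

public class AverageCombineSketch {
    // acc[0] = running sum, acc[1] = element count.
    static long[] createAccumulator() { return new long[] {0, 0}; }

    // Fold one input into the accumulator.
    static long[] addInput(long[] acc, long input) {
        acc[0] += input;
        acc[1] += 1;
        return acc;
    }

    // Combine accumulators produced on different workers or bundles.
    static long[] mergeAccumulators(long[] a, long[] b) {
        return new long[] {a[0] + b[0], a[1] + b[1]};
    }

    // Turn the final accumulator into the combined result.
    static double extractOutput(long[] acc) {
        return acc[1] == 0 ? 0.0 : (double) acc[0] / acc[1];
    }

    public static void main(String[] args) {
        long[] acc = createAccumulator();
        for (long v : List.of(1L, 2L, 3L, 4L)) acc = addInput(acc, v);
        System.out.println(extractOutput(acc)); // 2.5
    }
}
```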
- extractOutput(AccumT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns the output value that is the result of combining all the input values represented by the given accumulator.
- extractOutput(HyperLogLogPlus) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
Output the whole structure so it can be queried, reused or stored easily.
- extractOutput(MergingDigest) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
Output the whole structure so it can be queried, reused or stored easily.
- extractOutput(Iterable<T>) - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- extractOutput(Long) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
- extractOutput(Object[]) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- extractOutput(Object[], CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- extractOutput(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
- extractOutput(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
- extractOutput(List<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- extractOutput(List<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- extractOutput(List<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- extractOutput(K, AccumT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.CompleteFlinkCombiner
- extractOutput(K, AccumT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.FinalFlinkCombiner
- extractOutput(K, AccumT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in interface org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.FlinkCombiner
- extractOutput(K, AccumT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.PartialFlinkCombiner
- extractOutput(SequenceRangeAccumulator) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- extractOutput(SketchFrequencies.Sketch<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
Output the whole structure so it can be queried, reused or stored easily.
- extractOutput(CovarianceAccumulator) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- extractOutput(VarianceAccumulator) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- extractOutput(BeamBuiltinAggregations.BitXOr.Accum) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- extractOutput(ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- extractOutput(Combine.Holder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
- extractOutput(Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
- extractOutputs(PCollectionRowTuple) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceSchemaTransformProvider
- extractOutputs(OutputT) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
- extractOutputStream(SparkCombineFn.WindowedAccumulator<?, ?, AccumT, ?>) - Method in class org.apache.beam.runners.spark.translation.SparkCombineFn
-
Extracts the stream of accumulated values.
- extractTableNamesFromNode(SqlNode) - Static method in class org.apache.beam.sdk.extensions.sql.TableNameExtractionUtils
- extractTimestampAttribute(String, Map<String, String>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return the timestamp (in ms since unix epoch) to use for a Pubsub message with
timestampAttribute and attributes. - extractTimestampsFromValues() - Static method in class org.apache.beam.sdk.transforms.Reify
-
Extracts the timestamps from each value in a
KV.
F
- factory - Variable in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
- Factory<T> - Interface in org.apache.beam.sdk.schemas
-
A Factory interface for schema-related objects for a specific Java type.
- Factory() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
- FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
- FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.parser.BeamSqlParser
-
Parser factory.
- FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- FACTORY - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Factory for creating Pubsub clients using gRPC transport.
- FACTORY - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
Factory for creating Pubsub clients using Json transport.
- FAIL_FAST - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
-
Invalid write to Spanner will cause the pipeline to fail.
- FAIL_IF_EXISTS - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ExistingPipelineOptions
- failed(Error) - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
-
Report that a failure has occurred.
- failed(Exception) - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
-
Report that a failure has occurred.
- FAILED - Enum constant in enum class org.apache.beam.runners.local.ExecutionDriver.DriverState
- FAILED - Enum constant in enum class org.apache.beam.sdk.PipelineResult.State
-
The job has failed.
- FAILED - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
-
The tag for the failed writes to the HL7v2 store.
- FAILED_BODY - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
The tag for the failed writes to FHIR store.
- FAILED_BUNDLES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
-
The TupleTag used for bundles that failed to be executed for any reason.
- FAILED_FILES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
The tag for the files that failed to be written to the FHIR store.
- FAILED_PUBLISH_TAG - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO.Write
- FAILED_WRITES - Static variable in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- failedRecords(List<RecT>, List<ResT>) - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
- FailedRunningPipelineResults - Class in org.apache.beam.runners.jet
-
Alternative implementation of
PipelineResultused to avoid throwing Exceptions in certain situations. - FailedRunningPipelineResults(RuntimeException) - Constructor for class org.apache.beam.runners.jet.FailedRunningPipelineResults
- FailedWritesException(List<FirestoreV1.WriteFailure>) - Constructor for exception class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.FailedWritesException
- failOnInsert(Map<TableRow, List<TableDataInsertAllResponse.InsertErrors>>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
Cause a given
TableRowobject to fail when it's inserted. - FailsafeValueInSingleWindow<T,
ErrorT> - Class in org.apache.beam.sdk.values -
An immutable tuple of value, timestamp, window, and pane.
- FailsafeValueInSingleWindow() - Constructor for class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
- FailsafeValueInSingleWindow.Coder<T,
ErrorT> - Class in org.apache.beam.sdk.values -
A coder for
FailsafeValueInSingleWindow. - failure(String, String, Metadata, Throwable) - Static method in class org.apache.beam.sdk.io.tika.ParseResult
- failure(PAssert.PAssertionSite, Throwable) - Static method in class org.apache.beam.sdk.testing.SuccessOrFailure
- Failure - Class in org.apache.beam.sdk.schemas.io
-
A generic failure of an IO transform.
- Failure() - Constructor for class org.apache.beam.sdk.schemas.io.Failure
- Failure() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
- FAILURE_COUNTER - Static variable in class org.apache.beam.sdk.testing.PAssert
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
- Failure.Builder - Class in org.apache.beam.sdk.schemas.io
- FailureCollectorWrapper - Class in org.apache.beam.sdk.io.cdap.context
-
FailureCollectorWrapper is a class for collecting ValidationFailure.
- FailureCollectorWrapper() - Constructor for class org.apache.beam.sdk.io.cdap.context.FailureCollectorWrapper
- failures() - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
- failuresTo(List<PCollection<FailureElementT>>) - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
-
Adds the failure collection to the passed list and returns just the output collection.
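The failuresTo pattern above can be sketched as follows. This is a hedged illustration, not code from the Beam docs: the input `words` is a hypothetical PCollection<String>, and the snippet assumes the Beam Java SDK on the classpath (it will not compile without it).

```java
import java.util.ArrayList;
import java.util.List;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptors;

// Collect the failure outputs of a step into a shared list while the main
// output keeps flowing; `words` is a hypothetical PCollection<String>.
List<PCollection<String>> failures = new ArrayList<>();
PCollection<Integer> results =
    words
        .apply(MapElements.into(TypeDescriptors.integers())
            .via((String w) -> 100 / w.length())           // throws on empty strings
            .exceptionsInto(TypeDescriptors.strings())
            .exceptionsVia(ee -> ee.exception().getMessage()))
        .failuresTo(failures);
```

Each call to failuresTo appends that step's failure collection to the list and returns only the success collection, so several fallible steps can be chained and their failures flattened together afterwards.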
- FakeBigQueryServerStream(List<T>) - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
- FakeBigQueryServices - Class in org.apache.beam.sdk.io.gcp.testing
-
A fake implementation of BigQuery's query service.
- FakeBigQueryServices() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- FakeBigQueryServices.FakeBigQueryServerStream<T> - Class in org.apache.beam.sdk.io.gcp.testing
-
An implementation of
BigQueryServices.BigQueryServerStreamwhich takes aListas theIterableto simulate a server stream. - FakeDatasetService - Class in org.apache.beam.sdk.io.gcp.testing
-
A fake dataset service that can be serialized, for use in testReadFromTable.
- FakeDatasetService() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- FakeJobService - Class in org.apache.beam.sdk.io.gcp.testing
-
A fake implementation of BigQuery's job service.
- FakeJobService() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- FakeJobService(int) - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- Fanout() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
- Fault Tolerance - Search tag in class org.apache.beam.sdk.transforms.ParDo
- Section
- featureList - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytes
- featureList - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytesWithContext
- featureList - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUri
- featureList - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUriWithContext
- features() - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText
- fetchDataflowJobId() - Static method in class org.apache.beam.sdk.extensions.gcp.util.GceMetadataUtil
- fetchDataflowJobName() - Static method in class org.apache.beam.sdk.extensions.gcp.util.GceMetadataUtil
- fetchDataflowWorkerId() - Static method in class org.apache.beam.sdk.extensions.gcp.util.GceMetadataUtil
- FetchHL7v2Message() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message
-
Instantiates a new Fetch HL7v2 message DoFn.
- FetchHL7v2Message() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message
-
Instantiates a new Fetch HL7v2 message DoFn.
- fewKeys() - Method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns whether it groups just a few keys.
- fewKeys(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.PerKey, and sets fewKeys in GroupByKey. - FhirBundleParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
-
FhirBundleParameter represents a FHIR bundle in JSON format to be executed on a FHIR store.
- FhirBundleParameter() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
- FhirBundleResponse - Class in org.apache.beam.sdk.io.gcp.healthcare
- FhirBundleResponse() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
- FhirIO - Class in org.apache.beam.sdk.io.gcp.healthcare
-
FhirIOprovides an API for reading and writing resources to Google Cloud Healthcare Fhir API. - FhirIO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
- FhirIO.Deidentify - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Deidentify FHIR resources from a FHIR store to a destination FHIR store.
- FhirIO.Deidentify.DeidentifyFn - Class in org.apache.beam.sdk.io.gcp.healthcare
-
A function that schedules a deidentify operation and monitors the status.
- FhirIO.ExecuteBundles - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Execute bundles.
- FhirIO.ExecuteBundlesResult - Class in org.apache.beam.sdk.io.gcp.healthcare
-
ExecuteBundlesResult contains both successfully executed bundles and information to help debug failed executions (e.g. metadata & error msgs).
- FhirIO.Export - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Export FHIR resources from a FHIR store to new line delimited json files on GCS or BigQuery.
- FhirIO.Export.ExportResourcesFn - Class in org.apache.beam.sdk.io.gcp.healthcare
-
A function that schedules an export operation and monitors the status.
- FhirIO.Import - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Writes each bundle of elements to a new-line delimited JSON file on GCS and issues a fhirStores.import Request for that file.
- FhirIO.Import.ContentStructure - Enum Class in org.apache.beam.sdk.io.gcp.healthcare
-
The enum Content structure.
- FhirIO.Read - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Read.
- FhirIO.Read.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Result.
- FhirIO.Search<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Search.
- FhirIO.Search.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
- FhirIO.Write - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Write.
- FhirIO.Write.AbstractResult - Class in org.apache.beam.sdk.io.gcp.healthcare
- FhirIO.Write.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Result.
- FhirIO.Write.WriteMethod - Enum Class in org.apache.beam.sdk.io.gcp.healthcare
-
The enum Write method.
- FhirIOPatientEverything - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type FhirIOPatientEverything for querying a FHIR Patient resource's compartment.
- FhirIOPatientEverything() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
- FhirIOPatientEverything.PatientEverythingParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
-
PatientEverythingParameter defines required attributes for a FHIR GetPatientEverything request in
FhirIOPatientEverything. - FhirIOPatientEverything.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The Result for a
FhirIOPatientEverythingrequest. - FhirResourcePagesIterator(HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod, HealthcareApiClient, String, String, String, Map<String, Object>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
- FhirSearchParameter<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
FhirSearchParameter represents the query parameters for a FHIR search request, used as a parameter for
FhirIO.Search. - FhirSearchParameterCoder<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
FhirSearchParameterCoder is the coder for
FhirSearchParameter, which takes a coder for type T. - fhirStoresImport(String, String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
Import method for batch writing resources.
- fhirStoresImport(String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- fhirStoresImport(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfNestedStringBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfStringBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.ByteBufferBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.BytesBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.DateTimeBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.IntBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfIntBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfNestedIntBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedBytesBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedIntBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBuilderBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBundle.Field
- field(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.transforms.AddFields.Inner
-
Add a new field of the specified type.
- field(String, Schema.FieldType, Object) - Method in class org.apache.beam.sdk.schemas.transforms.AddFields.Inner
-
Add a new field of the specified type.
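The two field(...) overloads above can be chained fluently. A hedged sketch (the input collection `events`, its element type `Event`, and the field names are hypothetical; requires the Beam Java SDK):

```java
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.schemas.transforms.AddFields;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

// Widen a schema'd PCollection<Event> with two new fields:
// "region" is added with a null value; "retries" gets the default value 0.
PCollection<Row> widened =
    events.apply(
        AddFields.<Event>create()
            .field("region", Schema.FieldType.STRING)
            .field("retries", Schema.FieldType.INT32, 0));
```

The output schema is the input schema plus the added fields, so downstream schema transforms can select or group on them directly.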
- field(Row, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedRow
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfNestedStringBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfStringBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ByteBufferBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.BytesBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.DateTimeBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.IntBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfIntBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfNestedIntBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedBytesBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedIntBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBuilderBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.schemas.Schema.Field
- fieldAccess(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
-
Select a set of fields described in a
FieldAccessDescriptor. - fieldAccessDescriptor(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
-
Join by the following field access descriptor.
- FieldAccessDescriptor - Class in org.apache.beam.sdk.schemas
-
Used inside of a
DoFnto describe which fields in a schema type need to be accessed for processing. - FieldAccessDescriptor() - Constructor for class org.apache.beam.sdk.schemas.FieldAccessDescriptor
- FieldAccessDescriptor.FieldDescriptor - Class in org.apache.beam.sdk.schemas
-
Description of a single field.
- FieldAccessDescriptor.FieldDescriptor.Builder - Class in org.apache.beam.sdk.schemas
-
Builder class.
- FieldAccessDescriptor.FieldDescriptor.ListQualifier - Enum Class in org.apache.beam.sdk.schemas
-
Qualifier for a list selector.
- FieldAccessDescriptor.FieldDescriptor.MapQualifier - Enum Class in org.apache.beam.sdk.schemas
-
Qualifier for a map selector.
- FieldAccessDescriptor.FieldDescriptor.Qualifier - Class in org.apache.beam.sdk.schemas
-
OneOf union for a collection selector.
- FieldAccessDescriptor.FieldDescriptor.Qualifier.Kind - Enum Class in org.apache.beam.sdk.schemas
-
The kind of qualifier.
- FieldAccessDescriptorParser - Class in org.apache.beam.sdk.schemas.parser
-
Parser for textual field-access selector.
- FieldAccessDescriptorParser() - Constructor for class org.apache.beam.sdk.schemas.parser.FieldAccessDescriptorParser
- FieldDescriptor() - Constructor for class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
- fieldFromType(TypeDescriptor, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference
-
Map a Java field type to a Beam Schema FieldType.
- fieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
-
Join by the following field ids.
- fieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
-
Select a set of top-level field ids from the row.
- fieldIdsAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return the field ids accessed.
- fieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
-
Join by the following field names.
- fieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
-
Select a set of top-level field names from the row.
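A minimal sketch of selecting by name (the input `purchases` and its field names are hypothetical; requires the Beam Java SDK):

```java
import org.apache.beam.sdk.schemas.transforms.Select;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

// Project a schema'd collection down to two top-level fields.
PCollection<Row> projected =
    purchases.apply(Select.fieldNames("userId", "totalCost"));
```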
- fieldNamesAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return the field names accessed.
- fields() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
- fields() - Static method in class org.apache.beam.sdk.state.StateKeySpec
- fields(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.DropFields
- fields(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.DropFields
- fields(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.DropFields
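DropFields is the inverse of Select: it keeps every field except the named ones. A hedged sketch (the input `users` and the field name are hypothetical; requires the Beam Java SDK):

```java
import org.apache.beam.sdk.schemas.transforms.DropFields;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

// Remove a sensitive field; the output schema is the input schema minus "ssn".
PCollection<Row> scrubbed = users.apply(DropFields.fields("ssn"));
```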
- Fields() - Constructor for class org.apache.beam.sdk.schemas.transforms.Select.Fields
- Fields: - Search tag in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
- Section
- FieldsEqual() - Constructor for class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
- fieldSpecifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- FieldSpecifierContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
- FieldSpecifierNotationBaseListener - Class in org.apache.beam.sdk.schemas.parser.generated
-
This class provides an empty implementation of
FieldSpecifierNotationListener, which can be extended to create a listener which only needs to handle a subset of the available methods. - FieldSpecifierNotationBaseListener() - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
- FieldSpecifierNotationBaseVisitor<T> - Class in org.apache.beam.sdk.schemas.parser.generated
-
This class provides an empty implementation of
FieldSpecifierNotationVisitor, which can be extended to create a visitor which only needs to handle a subset of the available methods. - FieldSpecifierNotationBaseVisitor() - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
- FieldSpecifierNotationLexer - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationLexer(CharStream) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- FieldSpecifierNotationListener - Interface in org.apache.beam.sdk.schemas.parser.generated
-
This interface defines a complete listener for a parse tree produced by
FieldSpecifierNotationParser. - FieldSpecifierNotationParser - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser(TokenStream) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- FieldSpecifierNotationParser.ArrayQualifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.ArrayQualifierListContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.DotExpressionComponentContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.DotExpressionContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.FieldSpecifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.MapQualifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.MapQualifierListContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.QualifiedComponentContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.QualifierListContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.QualifyComponentContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.SimpleIdentifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.WildcardContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationVisitor<T> - Interface in org.apache.beam.sdk.schemas.parser.generated
-
This interface defines a complete generic visitor for a parse tree produced by
FieldSpecifierNotationParser. - FieldType() - Constructor for class org.apache.beam.sdk.schemas.Schema.FieldType
- FieldTypeDescriptors - Class in org.apache.beam.sdk.schemas
-
Utilities for converting between
Schemafield types andTypeDescriptors that define Java objects which can represent these field types. - FieldTypeDescriptors() - Constructor for class org.apache.beam.sdk.schemas.FieldTypeDescriptors
- fieldTypeForJavaType(TypeDescriptor) - Static method in class org.apache.beam.sdk.schemas.FieldTypeDescriptors
-
Get a
Schema.FieldTypefrom aTypeDescriptor. - fieldUpdate(String, String, String) - Static method in class org.apache.beam.sdk.io.mongodb.UpdateField
- FieldValueGetter<ObjectT,
ValueT> - Interface in org.apache.beam.sdk.schemas -
For internal use only; no backwards-compatibility guarantees.
- fieldValueGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated.new implementations should override
GetterBasedSchemaProvider.fieldValueGetters(TypeDescriptor, Schema)and make this method throw anUnsupportedOperationException - fieldValueGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.extensions.avro.schemas.AvroRecordSchema
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated.Delegates to the
GetterBasedSchemaProvider.fieldValueGetters(Class, Schema)for backwards compatibility, override it if you want to use the richer type signature contained in theTypeDescriptornot subject to the type erasure. - fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
- FieldValueHaver<ObjectT> - Interface in org.apache.beam.sdk.schemas
-
For internal use only; no backwards-compatibility guarantees.
- FieldValueSetter<ObjectT,
ValueT> - Interface in org.apache.beam.sdk.schemas -
For internal use only; no backwards-compatibility guarantees.
- FieldValueTypeInformation - Class in org.apache.beam.sdk.schemas
-
Represents type information for a Java type that will be used to infer a Schema type.
- FieldValueTypeInformation() - Constructor for class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- FieldValueTypeInformation.Builder - Class in org.apache.beam.sdk.schemas
- fieldValueTypeInformations(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated.new implementations should override
GetterBasedSchemaProvider.fieldValueTypeInformations(TypeDescriptor, Schema)and make this method throw anUnsupportedOperationException - fieldValueTypeInformations(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.extensions.avro.schemas.AvroRecordSchema
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated.Delegates to the
GetterBasedSchemaProvider.fieldValueTypeInformations(Class, Schema)for backwards compatibility, override it if you want to use the richer type signature contained in theTypeDescriptornot subject to the type erasure. - fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
- FieldValueTypeSupplier - Interface in org.apache.beam.sdk.schemas.utils
-
A naming policy for schema fields.
- FILE - Enum constant in enum class org.apache.beam.sdk.io.FileSystem.LineageLevel
- FILE_ARTIFACT_URN - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- FILE_LOADS - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
Use BigQuery load jobs to insert data.
- FILE_NAME_FIELD - Static variable in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
- FILE_TRIGGERING_BYTE_COUNT - Static variable in class org.apache.beam.sdk.io.WriteFiles
- FILE_TRIGGERING_RECORD_BUFFERING_DURATION - Static variable in class org.apache.beam.sdk.io.WriteFiles
- FILE_TRIGGERING_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.WriteFiles
- FileBasedReader(FileBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
Subclasses should not perform IO operations at the constructor.
- FileBasedSink<UserT,
DestinationT, - Class in org.apache.beam.sdk.ioOutputT> -
Abstract class for file-based output.
- FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
-
Construct a
FileBasedSinkwith the given temp directory, producing uncompressed files. - FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>, Compression) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
-
Construct a
FileBasedSinkwith the given temp directory and output channel type. - FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>, FileBasedSink.WritableByteChannelFactory) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
-
Construct a
FileBasedSinkwith the given temp directory and output channel type. - FileBasedSink.CompressionType - Enum Class in org.apache.beam.sdk.io
-
Deprecated.use
Compression. - FileBasedSink.DynamicDestinations<UserT,
DestinationT, - Class in org.apache.beam.sdk.ioOutputT> -
A class that allows value-dependent writes in
FileBasedSink. - FileBasedSink.FilenamePolicy - Class in org.apache.beam.sdk.io
-
A naming policy for output files.
- FileBasedSink.FileResult<DestinationT> - Class in org.apache.beam.sdk.io
-
Result of a single bundle write.
- FileBasedSink.FileResultCoder<DestinationT> - Class in org.apache.beam.sdk.io
-
A coder for
FileBasedSink.FileResultobjects. - FileBasedSink.OutputFileHints - Interface in org.apache.beam.sdk.io
-
Provides hints about how to generate output files, such as a suggested filename suffix (e.g.
- FileBasedSink.WritableByteChannelFactory - Interface in org.apache.beam.sdk.io
-
Implementations create instances of
WritableByteChannelused byFileBasedSinkand related classes to allow decorating, or otherwise transforming, the raw data that would normally be written directly to theWritableByteChannelpassed intoFileBasedSink.WritableByteChannelFactory.create(WritableByteChannel). - FileBasedSink.WriteOperation<DestinationT,
OutputT> - Class in org.apache.beam.sdk.io -
Abstract operation that manages the process of writing to
FileBasedSink. - FileBasedSink.Writer<DestinationT,
OutputT> - Class in org.apache.beam.sdk.io -
Abstract writer that writes a bundle to a
FileBasedSink. - FileBasedSource<T> - Class in org.apache.beam.sdk.io
-
A common base class for all file-based
Sources. - FileBasedSource(MatchResult.Metadata, long, long, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
-
Create a
FileBasedSourcebased on a single file. - FileBasedSource(ValueProvider<String>, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
-
Like
FileBasedSource(ValueProvider, EmptyMatchTreatment, long), but uses the default value ofEmptyMatchTreatment.DISALLOW. - FileBasedSource(ValueProvider<String>, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
-
Create a FileBasedSource based on a file or a file pattern specification, with the given strategy for treating filepatterns that do not match any files.
- FileBasedSource.FileBasedReader<T> - Class in org.apache.beam.sdk.io
-
A reader that implements code common to readers of FileBasedSources.
- FileBasedSource.Mode - Enum Class in org.apache.beam.sdk.io
-
A given FileBasedSource represents a file resource of one of these types.
- FileChecksumMatcher - Class in org.apache.beam.sdk.testing
-
Matcher to verify the checksum of the contents of a ShardedFile in E2E tests.
- fileContentsHaveChecksum(String) - Static method in class org.apache.beam.sdk.testing.FileChecksumMatcher
- FileIO - Class in org.apache.beam.sdk.io
-
General-purpose transforms for working with files: listing files (matching), reading and writing.
- FileIO() - Constructor for class org.apache.beam.sdk.io.FileIO
- FileIO.Match - Class in org.apache.beam.sdk.io
-
Implementation of
FileIO.match(). - FileIO.MatchAll - Class in org.apache.beam.sdk.io
-
Implementation of
FileIO.matchAll(). - FileIO.MatchConfiguration - Class in org.apache.beam.sdk.io
-
Describes configuration for matching filepatterns, such as EmptyMatchTreatment and continuous watching for matching files.
- FileIO.ReadableFile - Class in org.apache.beam.sdk.io
-
A utility class for accessing a potentially compressed file.
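A readable file that is "potentially compressed" is handed to the pipeline already decompressed. As a rough stdlib-only sketch of that behavior (this is illustrative, not the Beam implementation; the class and method names here are made up):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipReadSketch {
    // Decompress gzip bytes to a String, similar in spirit to how a
    // compressed matched file is decompressed before its bytes reach
    // the pipeline.
    static String readGzipped(byte[] compressed) throws IOException {
        try (InputStream in = new GZIPInputStream(new ByteArrayInputStream(compressed));
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return new String(out.toByteArray(), StandardCharsets.UTF_8);
        }
    }

    // Helper to produce gzip bytes for the demo.
    static byte[] gzip(String text) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bytes)) {
            gz.write(text.getBytes(StandardCharsets.UTF_8));
        }
        return bytes.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] compressed = gzip("hello,world");
        System.out.println(readGzipped(compressed)); // prints hello,world
    }
}
```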
- FileIO.ReadMatches - Class in org.apache.beam.sdk.io
-
Implementation of
FileIO.readMatches(). - FileIO.ReadMatches.DirectoryTreatment - Enum Class in org.apache.beam.sdk.io
-
Enum to control how directories are handled.
- FileIO.Sink<ElementT> - Interface in org.apache.beam.sdk.io
-
Specifies how to write elements to individual files in FileIO.write() and FileIO.writeDynamic().
- FileIO.Write<DestinationT, UserT> - Class in org.apache.beam.sdk.io
-
Implementation of FileIO.write() and FileIO.writeDynamic().
- FileIO.Write.FileNaming - Interface in org.apache.beam.sdk.io
-
A policy for generating names for shard files.
- FilenamePolicy() - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
- File naming - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- filepattern(String) - Method in class org.apache.beam.sdk.io.FileIO.Match
-
Matches the given filepattern.
- filepattern(String) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
-
Matches the given filepattern.
- filepattern(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Match
-
Like FileIO.Match.filepattern(String) but using a ValueProvider.
- filepattern(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
-
Like TikaIO.Parse.filepattern(String) but using a ValueProvider.
- FILEPATTERN - Enum constant in enum class org.apache.beam.sdk.io.FileBasedSource.Mode
- Filepattern expansion and watching - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroIO
- Section
- Filepattern expansion and watching - Search tag in class org.apache.beam.sdk.io.TextIO
- Section
- Filepattern expansion and watching - Search tag in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO
- Section
- FileReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
- FileReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
- FileReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
- FileReadSchemaTransformFormatProvider - Interface in org.apache.beam.sdk.io.fileschematransform
-
Interface that provides a PTransform that reads in a PCollection of FileIO.ReadableFiles and outputs the data represented as a PCollection of Rows.
- FileReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.fileschematransform
- FileReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
- FileReporter - Class in org.apache.beam.runners.flink.metrics
-
Flink metrics reporter for writing metrics to a file specified via the "metrics.reporter.file.path" config key (assuming an alias of "file" for this reporter in the "metrics.reporters" setting).
- FileReporter() - Constructor for class org.apache.beam.runners.flink.metrics.FileReporter
- FileResult(ResourceId, int, BoundedWindow, PaneInfo, DestinationT) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FileResult
- FileResultCoder(Coder<BoundedWindow>, Coder<DestinationT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
- fileSize(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns the file size from GCS or throws FileNotFoundException if the resource does not exist.
- FileStagingOptions - Interface in org.apache.beam.sdk.options
-
File staging related options.
- FileSystem<ResourceIdT> - Class in org.apache.beam.sdk.io
-
File system interface in Beam.
- FileSystem() - Constructor for class org.apache.beam.sdk.io.FileSystem
- FileSystem.LineageLevel - Enum Class in org.apache.beam.sdk.io
- FileSystemRegistrar - Interface in org.apache.beam.sdk.io
-
A registrar that creates FileSystem instances from PipelineOptions.
- FileSystems - Class in org.apache.beam.sdk.io
-
Clients facing FileSystem utility.
- FileSystems() - Constructor for class org.apache.beam.sdk.io.FileSystems
- FileSystemUtils - Class in org.apache.beam.sdk.io
- FileSystemUtils() - Constructor for class org.apache.beam.sdk.io.FileSystemUtils
- FileWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
-
The configuration for building file writing transforms using SchemaTransform and SchemaTransformProvider.
- FileWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- FileWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
- FileWriteSchemaTransformConfiguration.CsvConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
-
Configures extra details related to writing CSV formatted files.
- FileWriteSchemaTransformConfiguration.CsvConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
- FileWriteSchemaTransformConfiguration.ParquetConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
-
Configures extra details related to writing Parquet formatted files.
- FileWriteSchemaTransformConfiguration.ParquetConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
- FileWriteSchemaTransformConfiguration.XmlConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
-
Configures extra details related to writing XML formatted files.
- FileWriteSchemaTransformConfiguration.XmlConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
- FileWriteSchemaTransformFormatProvider - Interface in org.apache.beam.sdk.io.fileschematransform
-
Provides a PTransform that writes a PCollection of Rows and outputs a PCollection of the file names according to a registered AutoService FileWriteSchemaTransformFormatProvider implementation.
- FileWriteSchemaTransformFormatProviders - Class in org.apache.beam.sdk.io.fileschematransform
-
FileWriteSchemaTransformFormatProviders contains FileWriteSchemaTransformFormatProvider implementations.
- FileWriteSchemaTransformFormatProviders() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformFormatProviders
- FileWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.fileschematransform
-
A TypedSchemaTransformProvider implementation for writing a Row PCollection to file systems, driven by a FileWriteSchemaTransformConfiguration.
- FileWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
- FillGaps<ValueT> - Class in org.apache.beam.sdk.extensions.timeseries
-
Fill gaps in timeseries.
- FillGaps() - Constructor for class org.apache.beam.sdk.extensions.timeseries.FillGaps
- FillGaps.FillGapsDoFn<ValueT> - Class in org.apache.beam.sdk.extensions.timeseries
- FillGaps.InterpolateData<ValueT> - Class in org.apache.beam.sdk.extensions.timeseries
-
Argument to withInterpolateFunction function.
- Filter - Class in org.apache.beam.sdk.schemas.transforms
-
A PTransform for filtering a collection of schema types.
- Filter<T> - Class in org.apache.beam.sdk.transforms
-
PTransforms for filtering from a PCollection the elements satisfying a predicate, or satisfying an inequality with a given value based on the elements' natural ordering.
- Filter() - Constructor for class org.apache.beam.sdk.schemas.transforms.Filter
- FILTER - Enum constant in enum class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.PushDownOptions
- Filter.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
-
Implementation of the filter.
- filterCharacters(String) - Method in class org.apache.beam.runners.flink.metrics.FileReporter
- FilterForMutationDoFn - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
- FilterForMutationDoFn() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.FilterForMutationDoFn
- FilterUtils - Class in org.apache.beam.sdk.io.iceberg
-
Utilities that convert between a SQL filter expression and an Iceberg
Expression. - FilterUtils() - Constructor for class org.apache.beam.sdk.io.iceberg.FilterUtils
- FinalFlinkCombiner(CombineFnBase.GlobalCombineFn<?, AccumT, OutputT>) - Constructor for class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.FinalFlinkCombiner
- FINALIZE_STREAM - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.RpcMethod
- finalizeAllOutstandingBundles() - Method in class org.apache.beam.runners.fnexecution.control.BundleFinalizationHandlers.InMemoryFinalizer
-
All finalization requests will be sent without waiting for the responses.
- finalizeCheckpoint() - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.CheckpointMarkImpl
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.solace.read.SolaceCheckpointMark
- finalizeCheckpoint() - Method in interface org.apache.beam.sdk.io.UnboundedSource.CheckpointMark
-
Called by the system to signal that this checkpoint mark has been committed along with all the records which have been read from the UnboundedSource.UnboundedReader since the previous checkpoint was taken.
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.UnboundedSource.CheckpointMark.NoopCheckpointMark
- finalizeDestination(DestinationT, BoundedWindow, Integer, Collection<FileBasedSink.FileResult<DestinationT>>) - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
- finalizeWriteStream(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
-
Finalize a write stream.
- finalizeWriteStream(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- finalizeWriteStream(String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- find(String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex.
- find(String, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex.
- find(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindName PTransform that checks if a portion of the line matches the Regex.
- find(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex.
- find(Pattern, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex.
- find(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindName PTransform that checks if a portion of the line matches the Regex.
- Find(Pattern, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.Find
- findAll(String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindAll PTransform that checks if a portion of the line matches the Regex.
- findAll(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindAll PTransform that checks if a portion of the line matches the Regex.
- FindAll(Pattern) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindAll
- findAllTableIndexes() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Finds all indexes for the metadata table.
- findAvailablePort() - Static method in class org.apache.beam.sdk.extensions.python.PythonService
- findKV(String, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindKV PTransform that checks if a portion of the line matches the Regex.
- findKV(String, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindNameKV PTransform that checks if a portion of the line matches the Regex.
- findKV(Pattern, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindKV PTransform that checks if a portion of the line matches the Regex.
- findKV(Pattern, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindNameKV PTransform that checks if a portion of the line matches the Regex.
- FindKV(Pattern, int, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindKV
- FindName(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindName
- FindNameKV(Pattern, String, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindNameKV
- FindQuery - Class in org.apache.beam.sdk.io.mongodb
-
Builds a MongoDB FindQuery object.
- FindQuery() - Constructor for class org.apache.beam.sdk.io.mongodb.FindQuery
- finish() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- finish() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
- finish() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider.ErrorFn
- finish(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider.ErrorFn
- finish(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider.ErrorFn
- finish(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
- finish(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.ErrorFn
- finish(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.DataChangeRecordToRow
- finishBundle() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
- finishBundle() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
- finishBundle() - Method in class org.apache.beam.runners.spark.translation.DoFnRunnerWithMetrics
- finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
- finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
- finishBundle() - Method in class org.apache.beam.sdk.io.pulsar.WriteToPulsarDoFn
- finishBundle() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- finishBundle(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
- finishBundle(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedBatchedSolaceWriter
- finishBundle(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedStreamingSolaceWriter
- finishBundle(DoFn.FinishBundleContext, PipelineOptions) - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlSeekableTable
- finishBundle(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.transforms.View.ToListViewDoFn
- FinishBundleContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
- FINISHED - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
- finishRunnerBundle(DoFnRunner<InputT, OutputT>) - Method in class org.apache.beam.runners.jet.processors.ParDoP
- finishRunnerBundle(DoFnRunner<KV<?, ?>, OutputT>) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- finishSpecifying() - Method in interface org.apache.beam.sdk.state.StateSpec
-
For internal use only; no backwards-compatibility guarantees.
- finishSpecifying(PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
-
After building, finalizes this PValue to make it ready for running.
- finishSpecifying(PInput, PTransform<?, ?>) - Method in interface org.apache.beam.sdk.values.PValue
-
After building, finalizes this PValue to make it ready for being used as an input to a PTransform.
- finishSpecifying(PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PValueBase
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.io.requestresponse.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.solace.write.SolaceOutput
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.WriteFilesResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionList
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PDone
-
Does nothing; there is nothing to finish specifying.
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in interface org.apache.beam.sdk.values.POutput
-
As part of applying the producing
PTransform, finalizes this output to make it ready for being used as an input and for running. - finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PValueBase
- finishSplit(int) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- finishWrite() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Called after all calls to FileBasedSink.Writer.writeHeader(), FileBasedSink.Writer.write(OutputT) and FileBasedSink.Writer.writeFooter().
- FIRE_ALWAYS - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.Window.ClosingBehavior
-
Always fire the last pane.
- FIRE_ALWAYS - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.Window.OnTimeBehavior
-
Always fire the on-time pane.
- FIRE_IF_NON_EMPTY - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.Window.ClosingBehavior
-
Only fire the last pane if there is new data since the previous firing.
- FIRE_IF_NON_EMPTY - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.Window.OnTimeBehavior
-
Only fire the on-time pane if there is new data since the previous firing.
- fireEligibleTimers(InMemoryTimerInternals, Map<KV<String, String>, FnDataReceiver<Timer>>, Object) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
-
Fires all timers which are ready to be fired.
- FirestoreIO - Class in org.apache.beam.sdk.io.gcp.firestore
-
FirestoreIO provides an API for reading from and writing to Google Cloud Firestore.
- FirestoreOptions - Interface in org.apache.beam.sdk.io.gcp.firestore
- FirestoreV1 - Class in org.apache.beam.sdk.io.gcp.firestore
-
FirestoreV1 provides lifecycle-managed PTransforms for the Cloud Firestore v1 API.
- FirestoreV1.BatchGetDocuments - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a PTransform<PCollection<BatchGetDocumentsRequest>, PTransform<BatchGetDocumentsResponse>> which will read from Firestore.
- FirestoreV1.BatchGetDocuments.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for FirestoreV1.BatchGetDocuments allowing configuration and instantiation.
- FirestoreV1.BatchWriteWithDeadLetterQueue - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a PTransform<PCollection<Write>, PCollection<FirestoreV1.WriteFailure>> which will write to Firestore.
- FirestoreV1.BatchWriteWithDeadLetterQueue.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for FirestoreV1.BatchWriteWithDeadLetterQueue allowing configuration and instantiation.
- FirestoreV1.BatchWriteWithSummary - Class in org.apache.beam.sdk.io.gcp.firestore
- FirestoreV1.BatchWriteWithSummary.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for FirestoreV1.BatchWriteWithSummary allowing configuration and instantiation.
- FirestoreV1.FailedWritesException - Exception Class in org.apache.beam.sdk.io.gcp.firestore
-
Exception that is thrown if one or more Writes is unsuccessful with a non-retryable status code.
- FirestoreV1.ListCollectionIds - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a PTransform<PCollection<ListCollectionIdsRequest>, PTransform<ListCollectionIdsResponse>> which will read from Firestore.
- FirestoreV1.ListCollectionIds.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for FirestoreV1.ListCollectionIds allowing configuration and instantiation.
- FirestoreV1.ListDocuments - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a PTransform<PCollection<ListDocumentsRequest>, PTransform<ListDocumentsResponse>> which will read from Firestore.
- FirestoreV1.ListDocuments.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for FirestoreV1.ListDocuments allowing configuration and instantiation.
- FirestoreV1.PartitionQuery - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a PTransform<PCollection<PartitionQueryRequest>, PTransform<RunQueryRequest>> which will read from Firestore.
- FirestoreV1.PartitionQuery.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for FirestoreV1.PartitionQuery allowing configuration and instantiation.
- FirestoreV1.Read - Class in org.apache.beam.sdk.io.gcp.firestore
-
Type safe builder factory for read operations.
- FirestoreV1.RunQuery - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a PTransform<PCollection<RunQueryRequest>, PTransform<RunQueryResponse>> which will read from Firestore.
- FirestoreV1.RunQuery.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for FirestoreV1.RunQuery allowing configuration and instantiation.
- FirestoreV1.Write - Class in org.apache.beam.sdk.io.gcp.firestore
-
Type safe builder factory for write operations.
- FirestoreV1.WriteFailure - Class in org.apache.beam.sdk.io.gcp.firestore
-
Failure details for an attempted Write.
- FirestoreV1.WriteSuccessSummary - Class in org.apache.beam.sdk.io.gcp.firestore
-
Summary object produced when a number of writes are successfully written to Firestore in a single BatchWrite.
- fireTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- fireTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SplittableDoFnOperator
- fireTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.WindowDoFnOperator
- fireTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.translation.AbstractInOutIterator
-
Fires a timer using the DoFnRunner from the context and performs cleanup afterwards.
- fireTimerInternal(FlinkKey, TimerInternals.TimerData) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- fireTimerInternal(FlinkKey, TimerInternals.TimerData) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- fireTimestamp() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
-
Returns the firing timestamp of the current timer.
- first - Variable in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
- FIRST - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- firstInput(K, AccumT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.FinalFlinkCombiner
- firstInput(K, InputT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.CompleteFlinkCombiner
- firstInput(K, InputT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in interface org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.FlinkCombiner
- firstInput(K, InputT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.PartialFlinkCombiner
- fixDefaults() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
Fixes all the defaults so that equals can be used to check that two strategies are the same, regardless of the state of "defaulted-ness".
- FIXED_LENGTH - Static variable in class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- FIXED_WINDOW_TVF - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.TVFStreamingUtils
- FixedBytes - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A LogicalType representing a fixed-length byte array.
- FixedPrecisionNumeric - Class in org.apache.beam.sdk.schemas.logicaltypes
-
Fixed precision numeric types used to represent jdbc NUMERIC and DECIMAL types.
- fixedSizeGlobally(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a PTransform that takes a PCollection<T>, selects sampleSize elements, uniformly at random, and returns a PCollection<Iterable<T>> containing the selected elements.
- fixedSizePerKey(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a PCollection<KV<K, Iterable<V>>> that contains an output element mapping each distinct key in the input PCollection to a sample of sampleSize values associated with that key in the input PCollection, taken uniformly at random.
- fixedString(int) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
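The classic technique behind fixed-size uniform sampling of this kind is reservoir sampling. A conceptual stdlib sketch, under the assumption that Beam uses something in this family (this is not Beam's implementation):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class ReservoirSampleSketch {
    // Uniform fixed-size sampling via reservoir sampling: keep the first
    // sampleSize elements, then replace a random slot with decreasing
    // probability sampleSize/seen as more elements arrive.
    static <T> List<T> sample(Iterable<T> input, int sampleSize, Random rng) {
        List<T> reservoir = new ArrayList<>(sampleSize);
        long seen = 0;
        for (T element : input) {
            seen++;
            if (reservoir.size() < sampleSize) {
                reservoir.add(element);
            } else {
                long j = (long) (rng.nextDouble() * seen); // uniform in [0, seen)
                if (j < sampleSize) {
                    reservoir.set((int) j, element);
                }
            }
        }
        return reservoir;
    }

    public static void main(String[] args) {
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 1000; i++) data.add(i);
        System.out.println(sample(data, 5, new Random(42)).size()); // 5
    }
}
```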
- FixedString - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A LogicalType representing a fixed-length string.
- FIXEDSTRING - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- fixedStringSize() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- FixedWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A WindowFn that windows values into fixed-size timestamp-based windows.
- flatMap(RawUnionValue, Collector<WindowedValue<?>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStagePruningFunction
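Fixed-size timestamp-based windowing reduces to simple modular arithmetic: each timestamp belongs to the window starting at `ts - ((ts - offset) mod size)`. A minimal sketch of that arithmetic (conceptual, not Beam's WindowFn code):

```java
public class FixedWindowSketch {
    // Start of the fixed window containing timestampMillis, for windows of
    // sizeMillis aligned to offsetMillis. floorMod keeps negative
    // timestamps in the correct (earlier) window.
    static long windowStart(long timestampMillis, long sizeMillis, long offsetMillis) {
        long rem = Math.floorMod(timestampMillis - offsetMillis, sizeMillis);
        return timestampMillis - rem;
    }

    public static void main(String[] args) {
        // 60s windows, no offset: t=45s falls in the window starting at 0s.
        System.out.println(windowStart(45_000L, 60_000L, 0L));  // 0
        System.out.println(windowStart(125_000L, 60_000L, 0L)); // 120000
    }
}
```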
- flatMap(KV<K, Iterable<WindowedValue<V>>>, RecordCollector<WindowedValue<KV<K, Iterable<V>>>>) - Method in class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
- flatMap(WindowedValue<InputT>, Collector<WindowedValue<RawUnionValue>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkDoFnFunction
- flatMap(WindowedValue<RawUnionValue>, Collector<WindowedValue<T>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkMultiOutputPruningFunction
- flatMap(WindowedValue<T>, Collector<WindowedValue<T>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkAssignWindows
- flatMap(WindowedValue<T>, Collector<WindowedValue<T>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkExplodeWindowsFunction
- FlatMapElements<InputT,
OutputT> - Class in org.apache.beam.sdk.transforms -
PTransforms for mapping a simple function that returns iterables over the elements of a PCollection and merging the results. - FlatMapElements.FlatMapWithFailures<InputT,
OutputT, FailureT> - Class in org.apache.beam.sdk.transforms -
A
PTransform that adds exception handling to FlatMapElements. - Flat style - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- Section
- Flatten - Class in org.apache.beam.sdk.transforms
-
Flatten<T> takes multiple PCollection<T>s bundled into a PCollectionList<T> and returns a single PCollection<T> containing all the elements in all the input PCollections. - Flatten() - Constructor for class org.apache.beam.sdk.transforms.Flatten
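Flatten's documented semantics — the output contains every element of every input, with no deduplication — can be mimicked outside Beam with plain list concatenation. FlattenSketch is an illustrative name, not a Beam class:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Plain-Java sketch of Flatten's semantics: concatenate every input
// collection into one output. Illustrative only; not the Beam implementation,
// which also imposes no ordering guarantees.
public class FlattenSketch {
    public static <T> List<T> flatten(List<List<T>> inputs) {
        List<T> out = new ArrayList<>();
        for (List<T> input : inputs) {
            out.addAll(input);
        }
        return out;
    }

    public static void main(String[] args) {
        List<Integer> merged = flatten(Arrays.asList(Arrays.asList(1, 2), Arrays.asList(3)));
        System.out.println(merged); // [1, 2, 3]
    }
}
```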
- Flatten.Iterables<T> - Class in org.apache.beam.sdk.transforms
-
Flatten.Iterables<T> takes a PCollection<Iterable<T>> and returns a PCollection<T> that contains all the elements from each iterable. - Flatten.PCollections<T> - Class in org.apache.beam.sdk.transforms
-
A
PTransform that flattens a PCollectionList into a PCollection containing all the elements of all the PCollections in its input. - Flattened() - Constructor for class org.apache.beam.sdk.schemas.transforms.Select.Flattened
- FLATTENED_ROW_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
- flattenedSchema() - Static method in class org.apache.beam.sdk.schemas.transforms.Select
-
Selects every leaf-level field.
- FlattenP - Class in org.apache.beam.runners.jet.processors
-
Jet
Processor implementation for Beam's Flatten primitive. - FlattenP.Supplier - Class in org.apache.beam.runners.jet.processors
-
Jet
Processor supplier that will provide instances of FlattenP. - flattenRel(RelStructuredTypeFlattener) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- FlattenTransformProvider - Class in org.apache.beam.sdk.schemas.transforms.providers
-
An implementation of
TypedSchemaTransformProvider for Flatten. - FlattenTransformProvider() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
- FlattenTransformProvider.Configuration - Class in org.apache.beam.sdk.schemas.transforms.providers
- FlattenTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
- FlattenTranslatorBatch<T> - Class in org.apache.beam.runners.twister2.translators.batch
-
Flatten translator.
- FlattenTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.FlattenTranslatorBatch
- FlattenWithHeterogeneousCoders - Interface in org.apache.beam.sdk.testing
-
Category tag for tests that use a
Flatten where the input PCollectionList contains PCollections with heterogeneous coders. - FlinkAssignWindows<T,
W> - Class in org.apache.beam.runners.flink.translation.functions -
Flink
FlatMapFunction for implementing Window.Assign. - FlinkAssignWindows(WindowFn<T, W>) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkAssignWindows
- FlinkBatchPortablePipelineTranslator - Class in org.apache.beam.runners.flink
-
A translator that translates bounded portable pipelines into executable Flink pipelines.
- FlinkBatchPortablePipelineTranslator(Map<String, FlinkBatchPortablePipelineTranslator.PTransformTranslator>) - Constructor for class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
- FlinkBatchPortablePipelineTranslator.BatchTranslationContext - Class in org.apache.beam.runners.flink
-
Batch translation context.
- FlinkBatchPortablePipelineTranslator.IsFlinkNativeTransform - Class in org.apache.beam.runners.flink
-
Predicate to determine whether a URN is a Flink native transform.
- FlinkBatchPortablePipelineTranslator.PTransformTranslator - Interface in org.apache.beam.runners.flink
-
Transform translation interface.
- FlinkBoundedSource<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded
-
A Flink
Source implementation that wraps a Beam BoundedSource. - FlinkBoundedSource(String, BoundedSource<T>, SerializablePipelineOptions, Boundedness, int) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSource
- FlinkBoundedSource(String, BoundedSource<T>, SerializablePipelineOptions, Boundedness, int, FlinkSource.TimestampExtractor<WindowedValue<T>>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSource
- FlinkBoundedSourceReader<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded
-
A Flink
SourceReader implementation that reads from the assigned FlinkSourceSplits by using Beam BoundedReaders. - FlinkBoundedSourceReader(String, SourceReaderContext, PipelineOptions, ScheduledExecutorService, Function<WindowedValue<T>, Long>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSourceReader
- FlinkBoundedSourceReader(String, SourceReaderContext, PipelineOptions, Function<WindowedValue<T>, Long>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSourceReader
- FlinkBroadcastStateInternals<K> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.state
-
StateInternals that uses a Flink OperatorStateBackend to manage the broadcast state. - FlinkBroadcastStateInternals(int, OperatorStateBackend, SerializablePipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkBroadcastStateInternals
- FlinkDetachedRunnerResult - Class in org.apache.beam.runners.flink
-
Result of a detached execution of a
Pipeline with Flink. - FlinkDoFnFunction<InputT,
OutputT> - Class in org.apache.beam.runners.flink.translation.functions -
Encapsulates a
DoFn inside a Flink RichMapPartitionFunction. - FlinkDoFnFunction(DoFn<InputT, OutputT>, String, WindowingStrategy<?, ?>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, PipelineOptions, Map<TupleTag<?>, Integer>, TupleTag<OutputT>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkDoFnFunction
- FlinkExecutableStageContextFactory - Class in org.apache.beam.runners.flink.translation.functions
-
Singleton class that contains one
ExecutableStageContext.Factory per job. - FlinkExecutableStageFunction<InputT> - Class in org.apache.beam.runners.flink.translation.functions
-
Flink operator that passes its input DataSet through an SDK-executed
ExecutableStage. - FlinkExecutableStageFunction(String, PipelineOptions, RunnerApi.ExecutableStagePayload, JobInfo, Map<String, Integer>, FlinkExecutableStageContextFactory, Coder, Coder<WindowedValue<InputT>>) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction
- FlinkExecutableStagePruningFunction - Class in org.apache.beam.runners.flink.translation.functions
-
A Flink function that demultiplexes output from a
FlinkExecutableStageFunction. - FlinkExecutableStagePruningFunction(int, PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStagePruningFunction
-
Creates a
FlinkExecutableStagePruningFunction that extracts elements of the given union tag. - FlinkExecutionEnvironments - Class in org.apache.beam.runners.flink
-
Utilities for Flink execution environments.
- FlinkExecutionEnvironments() - Constructor for class org.apache.beam.runners.flink.FlinkExecutionEnvironments
- FlinkExplodeWindowsFunction<T> - Class in org.apache.beam.runners.flink.translation.functions
-
Explode
WindowedValue that belongs to multiple windows into multiple "single window" values, so we can safely group elements by (K, W) tuples. - FlinkExplodeWindowsFunction() - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkExplodeWindowsFunction
- FlinkIdentityFunction<T> - Class in org.apache.beam.runners.flink.translation.functions
-
A map function that outputs the input element without any change.
- FlinkJobInvoker - Class in org.apache.beam.runners.flink
-
Job Invoker for the
FlinkRunner. - FlinkJobInvoker(FlinkJobServerDriver.FlinkServerConfiguration) - Constructor for class org.apache.beam.runners.flink.FlinkJobInvoker
- FlinkJobServerDriver - Class in org.apache.beam.runners.flink
-
Driver program that starts a job server for the Flink runner.
- FlinkJobServerDriver.FlinkServerConfiguration - Class in org.apache.beam.runners.flink
-
Flink runner-specific Configuration for the jobServer.
- FlinkKey - Class in org.apache.beam.runners.flink.adapter
- FlinkKey() - Constructor for class org.apache.beam.runners.flink.adapter.FlinkKey
- FlinkKeyUtils - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
-
Utility functions for dealing with key encoding.
- FlinkKeyUtils() - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.FlinkKeyUtils
- FlinkMergingNonShuffleReduceFunction<K,
InputT, AccumT, OutputT, W> - Class in org.apache.beam.runners.flink.translation.functions -
Special version of
FlinkReduceFunction that supports merging windows. - FlinkMergingNonShuffleReduceFunction(CombineFnBase.GlobalCombineFn<InputT, AccumT, OutputT>, WindowingStrategy<Object, W>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkMergingNonShuffleReduceFunction
- FlinkMetricContainer - Class in org.apache.beam.runners.flink.metrics
-
Helper class for holding a
MetricsContainerImpl and forwarding Beam metrics to Flink accumulators and metrics. - FlinkMetricContainer(RuntimeContext) - Constructor for class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
- FlinkMetricContainerWithoutAccumulator - Class in org.apache.beam.runners.flink.metrics
-
The base helper class for holding a
MetricsContainerImpl and forwarding Beam metrics to Flink accumulators and metrics. - FlinkMetricContainerWithoutAccumulator(MetricGroup) - Constructor for class org.apache.beam.runners.flink.metrics.FlinkMetricContainerWithoutAccumulator
- FlinkMiniClusterEntryPoint - Class in org.apache.beam.runners.flink
-
Entry point for starting an embedded Flink cluster.
- FlinkMiniClusterEntryPoint() - Constructor for class org.apache.beam.runners.flink.FlinkMiniClusterEntryPoint
- FlinkMultiOutputPruningFunction<T> - Class in org.apache.beam.runners.flink.translation.functions
-
A
FlatMapFunction that filters out those elements that don't belong in this output. - FlinkMultiOutputPruningFunction(int, PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkMultiOutputPruningFunction
- FlinkNonMergingReduceFunction<K,
InputT> - Class in org.apache.beam.runners.flink.translation.functions -
Reduce function for non-merging GBK implementation.
- FlinkNonMergingReduceFunction(WindowingStrategy<?, ?>, boolean) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkNonMergingReduceFunction
- FlinkNoOpStepContext - Class in org.apache.beam.runners.flink.translation.functions
-
A
StepContext for Flink Batch Runner execution. - FlinkNoOpStepContext() - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkNoOpStepContext
- FlinkPartialReduceFunction<K,
InputT, AccumT, W> - Class in org.apache.beam.runners.flink.translation.functions -
This is the first step for executing a
Combine.PerKey on Flink. - FlinkPartialReduceFunction(CombineFnBase.GlobalCombineFn<InputT, AccumT, ?>, WindowingStrategy<Object, W>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkPartialReduceFunction
- FlinkPartialReduceFunction(CombineFnBase.GlobalCombineFn<InputT, AccumT, ?>, WindowingStrategy<Object, W>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, PipelineOptions, boolean) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkPartialReduceFunction
- FlinkPipelineOptions - Interface in org.apache.beam.runners.flink
-
Options which can be used to configure the Flink Runner.
- FlinkPipelineOptions.MaxBundleSizeFactory - Class in org.apache.beam.runners.flink
-
Maximum bundle size factory.
- FlinkPipelineOptions.MaxBundleTimeFactory - Class in org.apache.beam.runners.flink
-
Maximum bundle time factory.
- FlinkPipelineRunner - Class in org.apache.beam.runners.flink
-
Runs a Pipeline on Flink via
FlinkRunner. - FlinkPipelineRunner(FlinkPipelineOptions, String, List<String>) - Constructor for class org.apache.beam.runners.flink.FlinkPipelineRunner
-
Set up a Flink pipeline runner.
- FlinkPortableClientEntryPoint - Class in org.apache.beam.runners.flink
-
Flink job entry point to launch a Beam pipeline by executing an external SDK driver program.
- FlinkPortableClientEntryPoint(String) - Constructor for class org.apache.beam.runners.flink.FlinkPortableClientEntryPoint
- FlinkPortablePipelineTranslator<T> - Interface in org.apache.beam.runners.flink
-
Interface for portable Flink translators.
- FlinkPortablePipelineTranslator.Executor - Interface in org.apache.beam.runners.flink
-
A handle used to execute a translated pipeline.
- FlinkPortablePipelineTranslator.TranslationContext - Interface in org.apache.beam.runners.flink
-
The context used for pipeline translation.
- FlinkPortableRunnerResult - Class in org.apache.beam.runners.flink
-
Result of executing a portable
Pipeline with Flink. - FlinkPortableRunnerUtils - Class in org.apache.beam.runners.flink.translation.utils
-
Various utilities related to portability.
- FlinkReduceFunction<K,
AccumT, OutputT, W> - Class in org.apache.beam.runners.flink.translation.functions -
This is the second part for executing a
Combine.PerKey on Flink. - FlinkReduceFunction(CombineFnBase.GlobalCombineFn<?, AccumT, OutputT>, WindowingStrategy<Object, W>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkReduceFunction
- FlinkReduceFunction(CombineFnBase.GlobalCombineFn<?, AccumT, OutputT>, WindowingStrategy<Object, W>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, PipelineOptions, boolean) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkReduceFunction
- FlinkRunner - Class in org.apache.beam.runners.flink
-
A
PipelineRunner that executes the operations in the pipeline by first translating them to a Flink Plan and then executing them either locally or on a Flink cluster, depending on the configuration. - FlinkRunner(FlinkPipelineOptions) - Constructor for class org.apache.beam.runners.flink.FlinkRunner
- FlinkRunnerRegistrar - Class in org.apache.beam.runners.flink
-
AutoService registrar - will register FlinkRunner and FlinkOptions as possible pipeline runner services.
- FlinkRunnerRegistrar.Options - Class in org.apache.beam.runners.flink
-
Pipeline options registrar.
- FlinkRunnerRegistrar.Runner - Class in org.apache.beam.runners.flink
-
Pipeline runner registrar.
- FlinkRunnerResult - Class in org.apache.beam.runners.flink
-
Result of executing a
Pipeline with Flink. - FlinkServerConfiguration() - Constructor for class org.apache.beam.runners.flink.FlinkJobServerDriver.FlinkServerConfiguration
- FlinkSideInputReader - Class in org.apache.beam.runners.flink.translation.functions
-
A
SideInputReader for the Flink Batch Runner. - FlinkSideInputReader(Map<PCollectionView<?>, WindowingStrategy<?, ?>>, RuntimeContext) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkSideInputReader
- FlinkSource<T,
OutputT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source -
The base class for
FlinkBoundedSource and FlinkUnboundedSource. - FlinkSource(String, Source<T>, SerializablePipelineOptions, Boundedness, int) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- FlinkSource.TimestampExtractor<T> - Interface in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source
- FlinkSourceReaderBase<T,
OutputT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source -
An abstract implementation of
SourceReader which encapsulates Beam Sources for data reading. - FlinkSourceReaderBase(String, ScheduledExecutorService, SourceReaderContext, PipelineOptions, Function<OutputT, Long>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- FlinkSourceReaderBase(String, SourceReaderContext, PipelineOptions, Function<OutputT, Long>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- FlinkSourceReaderBase.ReaderAndOutput - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source
-
A wrapper for the reader and its associated information.
- FlinkSourceSplit<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source
-
A Flink
SourceSplit implementation that encapsulates a Beam Source. - FlinkSourceSplit(int, Source<T>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplit
- FlinkSourceSplit(int, Source<T>, byte[], UnboundedSource.CheckpointMark) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplit
- FlinkSourceSplitEnumerator<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source
- FlinkSourceSplitEnumerator(SplitEnumeratorContext<FlinkSourceSplit<T>>, Source<T>, PipelineOptions, int) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplitEnumerator
- FlinkSourceSplitEnumerator(SplitEnumeratorContext<FlinkSourceSplit<T>>, Source<T>, PipelineOptions, int, boolean) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplitEnumerator
- FlinkStateBackendFactory - Interface in org.apache.beam.runners.flink
-
Constructs a StateBackend to use from Flink pipeline options.
- FlinkStatefulDoFnFunction<K,
V, OutputT> - Class in org.apache.beam.runners.flink.translation.functions -
A
RichGroupReduceFunction for stateful ParDo in Flink Batch Runner. - FlinkStatefulDoFnFunction(DoFn<KV<K, V>, OutputT>, String, WindowingStrategy<?, ?>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, PipelineOptions, Map<TupleTag<?>, Integer>, TupleTag<OutputT>, Coder<KV<K, V>>, Map<TupleTag<?>, Coder<?>>, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkStatefulDoFnFunction
- FlinkStateInternals<K> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.state
-
StateInternals that uses a Flink KeyedStateBackend to manage state. - FlinkStateInternals(KeyedStateBackend<FlinkKey>, Coder<K>, Coder<? extends BoundedWindow>, SerializablePipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals
- FlinkStateInternals.EarlyBinder - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.state
-
Eagerly create user state to work around https://jira.apache.org/jira/browse/FLINK-12653.
- FlinkStateInternals.FlinkStateNamespaceKeySerializer - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.state
- FlinkStateInternals.FlinkStateNamespaceKeySerializer.FlinkStateNameSpaceSerializerSnapshot - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.state
-
Serializer configuration snapshot for compatibility and format evolution.
- FlinkStateNamespaceKeySerializer(Coder<? extends BoundedWindow>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- FlinkStateNameSpaceSerializerSnapshot() - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer.FlinkStateNameSpaceSerializerSnapshot
- FlinkStepContext() - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.FlinkStepContext
- FlinkStreamingAggregationsTranslators - Class in org.apache.beam.runners.flink
- FlinkStreamingAggregationsTranslators() - Constructor for class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
- FlinkStreamingAggregationsTranslators.ConcatenateAsIterable<T> - Class in org.apache.beam.runners.flink
- FlinkStreamingPortablePipelineTranslator - Class in org.apache.beam.runners.flink
-
Translate an unbounded portable pipeline representation into a Flink pipeline representation.
- FlinkStreamingPortablePipelineTranslator() - Constructor for class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
- FlinkStreamingPortablePipelineTranslator(Map<String, FlinkStreamingPortablePipelineTranslator.PTransformTranslator<FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext>>) - Constructor for class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
- FlinkStreamingPortablePipelineTranslator.IsFlinkNativeTransform - Class in org.apache.beam.runners.flink
-
Predicate to determine whether a URN is a Flink native transform.
- FlinkStreamingPortablePipelineTranslator.PTransformTranslator<T> - Interface in org.apache.beam.runners.flink
- FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext - Class in org.apache.beam.runners.flink
-
Streaming translation context.
- FlinkUnboundedSource<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded
-
A Flink
Source implementation that wraps a Beam UnboundedSource. - FlinkUnboundedSource(String, UnboundedSource<T, ?>, SerializablePipelineOptions, int) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSource
- FlinkUnboundedSource(String, UnboundedSource<T, ?>, SerializablePipelineOptions, int, FlinkSource.TimestampExtractor<WindowedValue<ValueWithRecordId<T>>>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSource
- FlinkUnboundedSourceReader<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded
-
A Flink
SourceReader implementation that reads from the assigned FlinkSourceSplits by using Beam UnboundedReaders. - FlinkUnboundedSourceReader(String, SourceReaderContext, PipelineOptions, ScheduledExecutorService, Function<WindowedValue<ValueWithRecordId<T>>, Long>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
- FlinkUnboundedSourceReader(String, SourceReaderContext, PipelineOptions, Function<WindowedValue<ValueWithRecordId<T>>, Long>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
- FLOAT - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- FLOAT - Enum constant in enum class org.apache.beam.sdk.transforms.display.DisplayData.Type
- FLOAT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- FLOAT - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of float fields.
- FLOAT32 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- FLOAT32 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- FLOAT64 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- FLOAT64 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- FloatCoder - Class in org.apache.beam.sdk.coders
-
A
FloatCoder encodes Float values in 4 bytes using Java serialization. - floats() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The
TypeDescriptor for Float. - floatToByteArray(float) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
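The FloatCoder entry above documents a fixed 4-byte encoding of Float values. A plain-Java round-trip in that spirit (hypothetical FloatBytes class, not Beam's coder code) can be sketched as:

```java
import java.nio.ByteBuffer;

// Illustrative sketch of a fixed 4-byte float encoding, matching the
// "4 bytes" size that FloatCoder documents; hypothetical class, not Beam code.
public class FloatBytes {
    public static byte[] encode(float value) {
        // Big-endian IEEE 754 single-precision representation.
        return ByteBuffer.allocate(4).putFloat(value).array();
    }

    public static float decode(byte[] bytes) {
        return ByteBuffer.wrap(bytes).getFloat();
    }

    public static void main(String[] args) {
        byte[] encoded = encode(3.5f);
        System.out.println(encoded.length); // 4
        System.out.println(decode(encoded)); // 3.5
    }
}
```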
- flush() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
- flush() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
- flush() - Method in interface org.apache.beam.sdk.fn.data.CloseableFnDataReceiver
-
Deprecated. To be removed once splitting/checkpointing are available in SDKs and rewinding in readers.
- flush() - Method in interface org.apache.beam.sdk.io.FileIO.Sink
-
Flushes the buffered state (if any) before the channel is closed.
- flush() - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
- flush() - Method in class org.apache.beam.sdk.io.TextIO.Sink
- flush() - Method in class org.apache.beam.sdk.io.TFRecordIO.Sink
- flush() - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.Sink
- flush() - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
- flush(boolean) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- flush(String, long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
-
Flush a given stream up to the given offset.
- flush(String, long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- flush(String, long) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- FLUSH_ROWS - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.RpcMethod
- flushBufferedMetrics() - Method in interface org.apache.beam.sdk.io.kafka.KafkaMetrics
- flushBufferedMetrics() - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
-
Export all metrics recorded in this instance to the underlying
perWorkerMetrics containers. - flushBufferedMetrics() - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.NoOpKafkaMetrics
- flushBundle(DoFn.OnTimerContext) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedBatchedSolaceWriter
- flushData() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- fn(Contextful.Fn<InputT, OutputT>, Requirements) - Static method in class org.apache.beam.sdk.transforms.Contextful
-
Same as
Contextful.of(ClosureT, org.apache.beam.sdk.transforms.Requirements) but with better type inference behavior for the case of Contextful.Fn. - fn(ProcessFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Contextful
- fn(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Contextful
-
Binary compatibility adapter for
Contextful.fn(ProcessFunction). - FnApiControlClient - Class in org.apache.beam.runners.fnexecution.control
-
A client for the control plane of an SDK harness, which can issue requests to it over the Fn API.
- FnApiControlClientPoolService - Class in org.apache.beam.runners.fnexecution.control
-
A Fn API control service which adds incoming SDK harness connections to a sink.
- FnDataReceiver<T> - Interface in org.apache.beam.sdk.fn.data
-
A receiver of streamed data.
- FnDataService - Interface in org.apache.beam.runners.fnexecution.data
-
The
FnDataService is able to forward inbound elements to a consumer and is also a consumer of outbound elements. - FnService - Interface in org.apache.beam.sdk.fn.server
-
An interface sharing common behavior with services used during execution of user Fns.
- Footer() - Constructor for class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- FooterCoder() - Constructor for class org.apache.beam.runners.dataflow.internal.IsmFormat.FooterCoder
- forBagUserStateHandlerFactory(ProcessBundleDescriptors.ExecutableProcessBundleDescriptor, StateRequestHandlers.BagUserStateHandlerFactory) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
-
Returns an adapter which converts a
StateRequestHandlers.BagUserStateHandlerFactory to a StateRequestHandler. - forBatch(PCollectionView<ViewT>) - Static method in class org.apache.beam.runners.dataflow.CreateDataflowView
- forBoolean(Boolean) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a
CloudObject representing the given value. - forBytes() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
-
Returns a
HllCount.Init.Builder for a HllCount.Init combining PTransform that computes bytes-type HLL++ sketches. - forClass(Class<?>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a
CloudObject to be used for serializing an instance of the supplied class for transport via the Dataflow API. - forClassName(String) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a
CloudObject to be used for serializing data to be deserialized using the supplied class name for transport via the Dataflow API. - forCoder(TypeDescriptor<?>, Coder<?>) - Static method in class org.apache.beam.sdk.coders.CoderProviders
-
Creates a
CoderProvider that always returns the given coder for the specified type. - forConsumers(List<DataEndpoint<?>>, List<TimerEndpoint<?>>) - Static method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Creates a receiver that is able to consume elements multiplexing on to the provided set of endpoints.
- forDescriptor(Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
- forDescriptor(ProtoDomain, Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
-
Deprecated. Create a new ProtoDynamicMessageSchema from a
ProtoDomain and for a descriptor. - forDescriptor(ProtoDomain, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
-
Deprecated. Create a new ProtoDynamicMessageSchema from a
ProtoDomain and for a message. - forEncoding(ByteString) - Static method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
- forever(Trigger) - Static method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
-
Create a composite trigger that repeatedly executes the trigger
repeated, firing each time it fires and ignoring any indications to finish. - forField(TypeDescriptor<?>, Field, int) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forFloat(Double) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a
CloudObject representing the given value. - forFloat(Float) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a
CloudObject representing the given value. - forGetter(Method, int) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forGetter(TypeDescriptor<?>, Method, int) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forHandler(RunnerApi.Environment, InstructionRequestHandler) - Static method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
-
Create a new
RemoteEnvironment for the provided RunnerApi.Environment and AutoCloseable InstructionRequestHandler. - forInteger(Integer) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a
CloudObject representing the given value. - forInteger(Long) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a
CloudObject representing the given value. - forIntegers() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
-
Returns a
HllCount.Init.Builder for a HllCount.Init combining PTransform that computes integer-type HLL++ sketches. - forIterableSideInput(String, String, Coder<V>, Coder<W>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.SideInputHandlerFactory
- forIterableSideInput(String, String, Coder<V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory
- forIterableSideInput(String, String, Coder<V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.StreamingSideInputHandlerFactory
- forKey(K) - Static method in class org.apache.beam.runners.spark.stateful.SparkStateInternals
- forKeyAndState(K, Table<String, String, byte[]>) - Static method in class org.apache.beam.runners.spark.stateful.SparkStateInternals
- forKnownType(Object) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a
CloudObject representing the given value of a well-known cloud object type. - forLongs() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
-
Returns a
HllCount.Init.Builder for a HllCount.Init combining PTransform that computes long-type HLL++ sketches. - format - Variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- FormatAsTextFn() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.FormatAsTextFn
- formatByteStringRange(Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Returns formatted string of a partition for debugging.
- formatRecord(ElementT, Schema) - Method in interface org.apache.beam.sdk.extensions.avro.io.AvroIO.RecordFormatter
-
Deprecated.
- formatRecord(UserT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Convert an input record type into the output type.
- formatTimestamp(Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
-
Formats an
Instant timestamp with additional Beam-specific metadata, such as indicating whether the timestamp is the end of the global window or one of the distinguished values BoundedWindow.TIMESTAMP_MIN_VALUE or BoundedWindow.TIMESTAMP_MAX_VALUE. - forMultimapSideInput(String, String, KvCoder<K, V>, Coder<W>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.SideInputHandlerFactory
- forMultimapSideInput(String, String, KvCoder<K, V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory
- forMultimapSideInput(String, String, KvCoder<K, V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.StreamingSideInputHandlerFactory
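The formatTimestamp entry above renders an Instant while flagging Beam's sentinel window bounds. A minimal local sketch of that idea, using java.time rather than Beam's Joda-based Instant, with hypothetical sentinel constants (Beam's real TIMESTAMP_MIN_VALUE/TIMESTAMP_MAX_VALUE use different values):

```java
import java.time.Instant;

public class TimestampFormat {
    // Hypothetical sentinels standing in for BoundedWindow.TIMESTAMP_MIN_VALUE /
    // TIMESTAMP_MAX_VALUE; Beam's actual constants differ.
    static final Instant MIN = Instant.ofEpochMilli(Long.MIN_VALUE / 1000);
    static final Instant MAX = Instant.ofEpochMilli(Long.MAX_VALUE / 1000);

    static String formatTimestamp(Instant t) {
        if (t.equals(MIN)) {
            return t.toEpochMilli() + " (TIMESTAMP_MIN_VALUE)";
        }
        if (t.equals(MAX)) {
            return t.toEpochMilli() + " (TIMESTAMP_MAX_VALUE)";
        }
        return t.toString(); // ISO-8601, e.g. 1970-01-01T00:00:00Z
    }

    public static void main(String[] args) {
        System.out.println(formatTimestamp(Instant.ofEpochMilli(0)));
        System.out.println(formatTimestamp(MIN));
    }
}
```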
- forNewInput(Instant, InputT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
Called by the
Watch transform to create a new independent termination state for a newly arrived InputT. - forOneOf(String, boolean, Map<String, FieldValueTypeInformation>) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forOrdinal(int) - Static method in enum class org.apache.beam.sdk.io.kafka.KafkaTimestampType
- forProject(String, int, String) - Static method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
-
Initializes a client for managing transform service instances.
- forRegistry(MetricRegistry) - Static method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
- forRegistry(MetricRegistry) - Static method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
- forRequestObserver(String, StreamObserver<BeamFnApi.InstructionRequest>, ConcurrentMap<String, BeamFnApi.ProcessBundleDescriptor>) - Static method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
-
Returns a
FnApiControlClient which will submit its requests to the provided observer. - forService(InstructionRequestHandler) - Static method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironmentFactory
- forSetter(Method) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forSetter(Method, String) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forSetter(TypeDescriptor<?>, Method) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forSetter(TypeDescriptor<?>, Method, String) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forSideInputHandlerFactory(Map<String, Map<String, ProcessBundleDescriptors.SideInputSpec>>, StateRequestHandlers.SideInputHandlerFactory) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
-
Returns an adapter which converts a
StateRequestHandlers.SideInputHandlerFactory to a StateRequestHandler. - forSqlType(Schema.TypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.BigDecimalConverter
- forStage(ExecutableStage) - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
- forStage(ExecutableStage) - Method in interface org.apache.beam.runners.fnexecution.control.JobBundleFactory
- forStage(ExecutableStage) - Method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
-
Deprecated.
- forStage(ExecutableStage, Map<RunnerApi.ExecutableStagePayload.SideInputId, PCollectionView<?>>, SideInputHandler) - Static method in class org.apache.beam.runners.fnexecution.translation.StreamingSideInputHandlerFactory
-
Creates a new state handler for the given stage.
- forStage(ExecutableStage, BatchSideInputHandlerFactory.SideInputGetter) - Static method in class org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory
-
Creates a new state handler for the given stage.
- forStreamFromSources(List<Integer>, Map<Integer, GlobalWatermarkHolder.SparkWatermarks>) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Build the
TimerInternals according to the feeding streams. - forStreaming(PCollectionView<ViewT>) - Static method in class org.apache.beam.runners.dataflow.CreateDataflowView
- forString(String) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a
CloudObject representing the given value. - forStrings() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
-
Returns a
HllCount.Init.Builder for a HllCount.Init combining PTransform that computes string-type HLL++ sketches. - forThrowable(Throwable) - Static method in class org.apache.beam.sdk.values.EncodableThrowable
-
Wraps
throwable and returns the result. - forTransformHierarchy(TransformHierarchy, PipelineOptions) - Static method in class org.apache.beam.sdk.Pipeline
- forTypeName(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
- forUserState(String, String, Coder<K>, Coder<V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.state.InMemoryBagUserStateFactory
- forUserState(String, String, Coder<K>, Coder<V>, Coder<W>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandlerFactory
- ForwardingClientResponseObserver<ReqT,
RespT> - Class in org.apache.beam.sdk.fn.stream -
A
ClientResponseObserver which delegates all StreamObserver calls. - forWriter(LogWriter) - Static method in class org.apache.beam.runners.fnexecution.logging.GrpcLoggingService
- freeze() - Method in class org.apache.beam.runners.jet.metrics.JetMetricResults
- from(double, double) - Static method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
-
A representation for the amount of known completed and remaining work.
- from(long) - Static method in class org.apache.beam.sdk.io.GenerateSequence
-
Specifies the minimum number to generate (inclusive).
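GenerateSequence.from(long) sets the inclusive lower bound of the sequence; paired with .to(long), the upper bound is exclusive. As a rough local analogue only (not the distributed Beam transform itself), the same bounds semantics can be sketched with LongStream:

```java
import java.util.stream.LongStream;

public class SequenceDemo {
    // Local analogue of GenerateSequence.from(f).to(t):
    // 'from' is inclusive, 'to' is exclusive.
    static long[] generate(long from, long to) {
        return LongStream.range(from, to).toArray();
    }

    public static void main(String[] args) {
        for (long v : generate(3, 7)) {
            System.out.println(v); // prints 3, 4, 5, 6
        }
    }
}
```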
- from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Read from table specified by a
TableReference. - from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- from(Struct) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.PartitionMetadataMapper
-
Transforms a
Struct representing a partition metadata row into a PartitionMetadata model. - from(ConfigT) - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
-
Produce a SchemaTransform from ConfigT.
- from(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
-
Reads from the given filename or filepattern.
- from(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
- from(String) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
- from(String) - Static method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
-
Instantiates a cross-language wrapper for a Python transform with a given transform name.
- from(String) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
-
Reads text from the file(s) with the given filename or filename pattern.
- from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Reads a BigQuery table specified as
"[project_id]:[dataset_id].[table_id]", "[project_id].[dataset_id].[table_id]", or "[dataset_id].[table_id]" for tables within the current project. - from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
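The three accepted table-spec shapes above differ only in whether and how the project is attached. A hypothetical regex-based parser illustrating those shapes (this is not Beam's actual BigQueryHelpers implementation, and the grammar here is simplified):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TableSpec {
    // Accepts "[project]:[dataset].[table]", "[project].[dataset].[table]",
    // or "[dataset].[table]" (simplified: no dots/colons inside components).
    private static final Pattern SPEC =
        Pattern.compile("^(?:([^:.]+)[:.])?([^:.]+)\\.([^:.]+)$");

    /** Returns {projectOrNull, dataset, table}. Hypothetical helper for illustration. */
    static String[] parse(String spec) {
        Matcher m = SPEC.matcher(spec);
        if (!m.matches()) {
            throw new IllegalArgumentException("Bad table spec: " + spec);
        }
        return new String[] {m.group(1), m.group(2), m.group(3)};
    }

    public static void main(String[] args) {
        String[] parts = parse("my-project:my_dataset.my_table");
        System.out.println(parts[0] + " / " + parts[1] + " / " + parts[2]);
    }
}
```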
- from(String) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
- from(String) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
- from(String) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
-
Provide name of collection while reading from Solr.
- from(String) - Method in class org.apache.beam.sdk.io.TextIO.Read
-
Reads text from the file(s) with the given filename or filename pattern.
- from(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
-
Returns a transform for reading TFRecord files that reads from the file(s) with the given filename or filename pattern.
- from(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
Reads a single XML file or a set of XML files defined by a Java "glob" file pattern.
- from(String, String) - Static method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
-
Instantiates a cross-language wrapper for a Python transform with a given transform name.
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Produces a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
-
Produce a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(String, Row, Schema) - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
-
Produce a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
-
Produce a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
-
Produce a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
Produce a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(ExecutorService) - Static method in class org.apache.beam.sdk.fn.test.TestExecutors
- from(Supplier<ExecutorService>) - Static method in class org.apache.beam.sdk.fn.test.TestExecutors
- from(Map<String, String>) - Static method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Deprecated. Expects a map keyed by logger
Names with values representing Levels. - from(Map<String, String>) - Static method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
-
Expects a map keyed by logger
Names with values representing LogLevels. - from(DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
- from(WindowIntoTransformProvider.Configuration) - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
- from(SqsReadConfiguration) - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadSchemaTransformProvider
- from(BoundedSource<T>) - Method in class org.apache.beam.sdk.io.Read.Builder
-
Returns a new
Read.Bounded PTransform reading from the given BoundedSource. - from(BoundedSource<T>) - Static method in class org.apache.beam.sdk.io.Read
-
Returns a new
Read.Bounded PTransform reading from the given BoundedSource. - from(CsvWriteTransformProvider.CsvWriteConfiguration) - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
- from(FileBasedSource<T>) - Static method in class org.apache.beam.sdk.io.CompressedSource
-
Creates a
CompressedSource from an underlying FileBasedSource. - from(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
- from(FileWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
-
Builds a
SchemaTransform from a FileWriteSchemaTransformConfiguration. - from(MatchResult.Metadata) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
- from(BigQueryExportReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
-
Returns the expected
SchemaTransform of the configuration. - from(BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- from(BigQueryWriteConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider
- from(BigQueryWriteConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- from(BigQueryWriteConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteSchemaTransformProvider
- from(BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
- from(BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
- from(PubsubReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- from(PubsubWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
- from(PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- from(PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- from(SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
- from(SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- from(SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- from(IcebergCdcReadSchemaTransformProvider.Configuration) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCdcReadSchemaTransformProvider
- from(IcebergReadSchemaTransformProvider.Configuration) - Method in class org.apache.beam.sdk.io.iceberg.IcebergReadSchemaTransformProvider
- from(IcebergWriteSchemaTransformProvider.Configuration) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider
- from(JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
- from(JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromMySqlSchemaTransformProvider
- from(JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromPostgresSchemaTransformProvider
- from(JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromSqlServerSchemaTransformProvider
- from(JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
- from(JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToMySqlSchemaTransformProvider
- from(JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToPostgresSchemaTransformProvider
- from(JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToSqlServerSchemaTransformProvider
- from(JsonWriteTransformProvider.JsonWriteConfiguration) - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider
- from(KafkaReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- from(KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- from(SingleStoreSchemaTransformReadConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadProvider
-
Returns the expected
SchemaTransform of the configuration. - from(SingleStoreSchemaTransformWriteConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
-
Returns the expected
SchemaTransform of the configuration. - from(Solace.Queue) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
-
Set the queue name to read from.
- from(Solace.Topic) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
-
Set the topic name to read from.
- from(TFRecordReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider
-
Returns the expected
SchemaTransform of the configuration. - from(TFRecordWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider
-
Returns the expected
SchemaTransform of the configuration. - from(UnboundedSource<T, ?>) - Method in class org.apache.beam.sdk.io.Read.Builder
-
Returns a new
Read.Unbounded PTransform reading from the given UnboundedSource. - from(UnboundedSource<T, ?>) - Static method in class org.apache.beam.sdk.io.Read
- from(ManagedSchemaTransformProvider.ManagedConfig) - Method in class org.apache.beam.sdk.managed.ManagedSchemaTransformProvider
- from(TestSchemaTransformProvider.Config) - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
-
Reads from the given filename or filepattern.
- from(ValueProvider<String>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
-
Reads from the given file name or pattern ("glob").
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
-
Same as
from(filepattern), but accepting a ValueProvider. - from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Same as
from(String), but with a ValueProvider. - from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
-
Reads from the given filename or filepattern.
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TextIO.Read
-
Same as
from(filepattern), but accepting a ValueProvider. - from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
-
Same as
from(filepattern), but accepting a ValueProvider. - from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
Reads a single XML file or a set of XML files defined by a Java "glob" file pattern.
- from(GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
- from(FlattenTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
- from(JavaExplodeTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
- from(JavaFilterTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
- from(JavaMapToFieldsTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
- from(LoggingTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
- from(HasDisplayData) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Collect the
DisplayData from a component. - from(Row) - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
- from(Row) - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
Produce a
SchemaTransform from some transform-specific configuration object. - from(Row) - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
-
Produces a
SchemaTransform from a Row configuration. - from(TableIdentifier) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- fromArgs(String...) - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
-
Sets the command line arguments to parse when constructing the
PipelineOptions. - fromArgs(String...) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Sets the command line arguments to parse when constructing the
PipelineOptions. - fromArray(T...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables
-
Returns a
PrefetchableIterable over the specified values. - fromArray(T...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
-
Returns a
PrefetchableIterator over the specified values. - fromAvroType(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
-
Create an
AvroUtils.FixedBytesField from an AVRO type. - fromBeamFieldType(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
-
Create an
AvroUtils.FixedBytesField from a Beam Schema.FieldType. - fromByteArray(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
Utility method for deserializing a byte array using the specified coder.
- fromByteArray(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.CoderHelpers
-
Utility method for deserializing a byte array using the specified coder.
- fromByteArray(byte[], Coder<T>) - Static method in class org.apache.beam.runners.twister2.utils.TranslationUtils
-
Utility method for deserializing a byte array using the specified coder.
- fromByteArray(byte[], WindowedValues.WindowedValueCoder<T>) - Static method in class org.apache.beam.runners.twister2.utils.TranslationUtils
-
Utility method for deserializing a byte array using the specified coder.
- fromByteArrays(Collection<byte[]>, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
Utility method for deserializing an Iterable of byte arrays using the specified coder.
- fromByteFunction(Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
A function wrapper for converting a byte array to an object.
- FromByteFunction(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.runners.spark.coders.CoderHelpers.FromByteFunction
- fromByteFunctionIterable(Coder<K>, Coder<V>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
A function wrapper for converting a byte array pair to a key-value pair, where values are
Iterable. - fromCanonical(Compression) - Static method in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- fromCloudDuration(String) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
-
Converts a Dataflow API duration string into a
Duration. - fromCloudObject(CloudObject) - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
-
Converts back into the original object from a provided
CloudObject. - fromCloudObject(CloudObject) - Method in class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
-
Convert from a cloud object.
- fromCloudObject(CloudObject) - Method in class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
-
Convert from a cloud object.
- fromCloudPubsubMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
-
Transform messages publishable using PubsubIO to their equivalent Pub/Sub Lite publishable message.
- fromCloudTime(String) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
-
Converts a time value received via the Dataflow API into the corresponding
Instant. - fromComponents(String, String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Creates a GcsPath from bucket and object components.
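The fromComponents entry above builds a GcsPath out of a bucket and an object name. A hypothetical stand-alone sketch of that assembly into a "gs://bucket/object" string (Beam's real GcsPath performs more validation and implements java.nio.file.Path):

```java
public class GcsPathDemo {
    // Joins bucket and object into a "gs://bucket/object" URI string.
    // Hypothetical helper; not Beam's GcsPath implementation.
    static String fromComponents(String bucket, String object) {
        if (bucket == null || bucket.isEmpty()) {
            throw new IllegalArgumentException("bucket must be non-empty");
        }
        return "gs://" + bucket + "/" + (object == null ? "" : object);
    }

    public static void main(String[] args) {
        System.out.println(fromComponents("my-bucket", "path/to/file.txt"));
    }
}
```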
- fromComponents(List<Coder<?>>, byte[]) - Static method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- fromComponents(List<Coder<?>>, byte[], CoderTranslation.TranslationContext) - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderTranslator
- fromConfig(FlinkJobServerDriver.FlinkServerConfiguration) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
- fromConfig(FlinkJobServerDriver.FlinkServerConfiguration, JobServerDriver.JobInvokerFactory) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
- fromConfig(SparkJobServerDriver.SparkServerConfiguration) - Static method in class org.apache.beam.runners.spark.SparkJobServerDriver
- fromConfigRow(Row, PipelineOptions) - Method in class org.apache.beam.sdk.schemas.transforms.SchemaTransformTranslation.SchemaTransformPayloadTranslator
- fromExceptionInformation(RecordT, Coder<RecordT>, Exception, String) - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
- fromExecutableStage(String, ExecutableStage, Endpoints.ApiServiceDescriptor) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
- fromExecutableStage(String, ExecutableStage, Endpoints.ApiServiceDescriptor, Endpoints.ApiServiceDescriptor) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
-
Note that the
BeamFnApi.ProcessBundleDescriptoris constructed by: Adding gRPC read and write nodes wiring them to the specified data endpoint. - fromExistingTable(String, String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
-
Encapsulates a selected table name.
- fromFeedRange(FeedRange) - Static method in class org.apache.beam.sdk.io.azure.cosmos.NormalizedRange
- fromFile(File, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
- fromFile(String) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
- fromFile(String, OutputStream) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
- fromGenericAvroSchema(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert an Avro
Schema to a BigQuery TableSchema. - fromGenericAvroSchema(Schema, Boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert an Avro
Schema to a BigQuery TableSchema. - fromHex(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- fromHttpResponse(HttpResponse) - Static method in class org.apache.beam.sdk.io.solace.broker.BrokerResponse
- fromInt(Integer, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
- fromInt(Integer, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
- fromInt(Integer, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.DateConversion
- fromInt(Integer, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeConversion
- fromIr(Ir) - Static method in class org.apache.beam.sdk.extensions.sbe.SerializableIr
-
Creates a new instance from
ir. - fromIr(Ir, SbeSchema.IrOptions) - Static method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
-
Creates a new
SbeSchema from the given intermediate representation. - fromJsonFile(File) - Method in class org.apache.beam.sdk.io.cdap.ConfigWrapper
-
Gets
ConfigWrapper by JSON file. - fromJsonString(String, Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
- fromJsonString(String) - Method in class org.apache.beam.sdk.io.cdap.ConfigWrapper
-
Gets
ConfigWrapper by JSON string. - fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeMicrosConversion
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampConversion
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampMicrosConversion
- fromMap(Map<String, String>) - Static method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
-
Returns a new configuration instance using provided flags.
- fromModel(Message) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Converts a model Message to an HL7v2 message.
- fromName(String) - Static method in enum class org.apache.beam.io.debezium.Connectors
-
Returns a connector class corresponding to the given connector name.
- fromName(String) - Static method in class org.apache.beam.sdk.io.solace.data.Solace.Queue
- fromName(String) - Static method in class org.apache.beam.sdk.io.solace.data.Solace.Topic
- fromObject(StorageObject) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Creates a GcsPath from a StorageObject.
- fromOptions(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
-
Constructs a translator from the provided options.
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderFileSystemRegistrar
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.LocalFileSystemRegistrar
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner
-
Construct a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.TestDataflowRunner
-
Constructs a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.util.GcsStager
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.direct.DirectRunner
-
Construct a DirectRunner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.flink.FlinkRunner
-
Construct a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.flink.TestFlinkRunner
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.jet.JetRunner
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.portability.PortableRunner
-
Constructs a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.portability.testing.TestPortableRunner
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.portability.testing.TestUniversalRunner
-
Constructs a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.prism.PrismRunner
-
Invoked from Pipeline.run() where PrismRunner instantiates using PrismPipelineOptions configuration details.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.prism.TestPrismRunner
-
Invoked from Pipeline.run() where TestPrismRunner instantiates using TestPrismPipelineOptions configuration details.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Creates and returns a new SparkRunner with specified options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunnerDebugger
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
-
Creates and returns a new SparkStructuredStreamingRunner with specified options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.TestSparkRunner
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.twister2.Twister2Runner
-
Deprecated.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.twister2.Twister2TestRunner
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.auth.GcpCredentialFactory
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsFileSystemRegistrar
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.s3.DefaultS3FileSystemSchemeRegistrar
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemRegistrar
- fromOptions(PipelineOptions) - Method in interface org.apache.beam.sdk.io.aws2.s3.S3FileSystemSchemeRegistrar
-
Create zero or more S3FileSystemConfiguration instances from the given PipelineOptions.
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.blobstore.AzureBlobStoreFileSystemRegistrar
- fromOptions(PipelineOptions) - Method in interface org.apache.beam.sdk.io.FileSystemRegistrar
-
Create zero or more filesystems from the given PipelineOptions.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Creates an instance of this rule using provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.io.googleads.GoogleAdsUserCredentialFactory
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemRegistrar
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.PipelineRunner
-
Constructs a runner from the provided PipelineOptions.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.CrashingRunner
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.TestPipeline
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.TestPipelineExtension
-
Creates a new TestPipelineExtension with custom options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
-
Creates a ResourceHints instance with hints supplied in options.
- fromOptions(PipelineOptions, Function&lt;ClientConfig, JetInstance&gt;) - Static method in class org.apache.beam.runners.jet.JetRunner
- fromParams(String[]) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
- fromParams(String[]) - Static method in class org.apache.beam.runners.spark.SparkJobServerDriver
- fromParams(DefaultFilenamePolicy.Params) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
Construct a DefaultFilenamePolicy from a DefaultFilenamePolicy.Params object.
- fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Creates a class representing a Pub/Sub subscription from the specified subscription path.
- fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Creates a class representing a Cloud Pub/Sub topic from the specified topic path.
- fromPath(Path, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
- fromProcessFunctionWithOutputType(ProcessFunction<InputT, OutputT>, TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.InferableFunction
- fromProto(PubsubMessage) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages
- fromPTransform(RunnerApi.PTransform) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
- fromPTransform(RunnerApi.PTransform) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
- fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Reads results received after executing the given query.
- fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- fromQuery(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
A query to be executed in Snowflake.
- fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Same as fromQuery(String), but with a ValueProvider.
- fromQuery(ValueProvider&lt;String&gt;) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
- fromRawEvents(Coder<T>, List<TestStream.Event<T>>) - Static method in class org.apache.beam.sdk.testing.TestStream
-
For internal use only.
- fromResourceName(String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Creates a GcsPath from a OnePlatform resource name in string form.
- fromRow(Row) - Static method in class org.apache.beam.sdk.values.Row
-
Creates a row builder based on the specified row.
- fromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
- fromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
-
Given a type, returns a function that converts from a Row object to that type.
- fromRowFunction(TypeDescriptor&lt;T&gt;) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated.
- fromRowFunction(TypeDescriptor<T>) - Method in interface org.apache.beam.sdk.schemas.SchemaProvider
-
Given a type, returns a function that converts from a Row object to that type.
- fromRows(Class&lt;OutputT&gt;) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
-
Convert a PCollection&lt;Row&gt; into a PCollection&lt;OutputT&gt;.
- fromRows(TypeDescriptor&lt;OutputT&gt;) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
-
Convert a PCollection&lt;Row&gt; into a PCollection&lt;OutputT&gt;.
- fromS3Options(S3Options) - Static method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
- fromSerializableFunctionWithOutputType(SerializableFunction<InputT, OutputT>, TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.SimpleFunction
- fromSnapshot(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- fromSnapshot(Snapshot) - Static method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- fromSnapshot(Snapshot, String) - Static method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- fromSpec(Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject by copying the supplied serialized object spec, which must represent an SDK object serialized for transport via the Dataflow API.
- fromSpec(HCatalogIO.Read) - Static method in class org.apache.beam.sdk.io.hcatalog.HCatToRow
- fromStandardParameters(ValueProvider<ResourceId>, String, String, boolean) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
Construct a DefaultFilenamePolicy.
- fromStaticMethods(Class&lt;?&gt;, Class&lt;?&gt;) - Static method in class org.apache.beam.sdk.coders.CoderProviders
-
Creates a CoderProvider from a class's static &lt;T&gt; Coder&lt;T&gt; of(TypeDescriptor&lt;T&gt;, List&lt;Coder&lt;?&gt;&gt;) method.
- fromString(String, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
- fromString(ValueProvider<String>, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
- fromSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Reads from the given subscription.
- fromSubscription(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Like subscription() but with a ValueProvider.
- fromSupplier(SerializableSupplier&lt;Matcher&lt;T&gt;&gt;) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
Constructs a SerializableMatcher from a non-serializable Matcher via indirection through SerializableSupplier.
- fromTable(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
A table name to be read in Snowflake.
- fromTable(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
- fromTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a BigQuery TableSchema to a Beam Schema.
- fromTableSchema(TableSchema, BigQueryUtils.SchemaConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a BigQuery TableSchema to a Beam Schema.
- fromTimestamp(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- fromTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Creates and returns a transform for reading from a Cloud Pub/Sub topic.
- fromTopic(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Like PubsubIO.Read.fromTopic(String) but with a ValueProvider.
- fromUri(String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Creates a GcsPath from a URI in string form.
- fromUri(URI) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Creates a GcsPath from a URI.
- FULL - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
- FULL_RANGE - Static variable in class org.apache.beam.sdk.io.azure.cosmos.NormalizedRange
- FullNameTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider
-
Base class for table providers that look up table metadata using full table names, instead of querying it by parts of the name separately.
- FullNameTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.FullNameTableProvider
- fullOuterJoin(String, PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
-
Full Outer Join of two collections of KV elements.
- fullOuterJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
-
Full Outer Join of two collections of KV elements.
- fullOuterJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
-
Perform a full outer join.
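The fullOuterJoin entries above all share the same semantics: every key from either input appears in the output, and a key missing on one side is paired with the supplied default value. A minimal plain-Java sketch of those semantics (not the Beam API; the pair-as-Map.Entry representation is an assumption for illustration):

```java
import java.util.AbstractMap;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class FullOuterJoinSketch {
    // Pairs every key from either side; missing sides fall back to the
    // defaults, mirroring the null-value arguments of Join.fullOuterJoin.
    public static <K, V1, V2> Map<K, Map.Entry<V1, V2>> fullOuterJoin(
            Map<K, V1> left, Map<K, V2> right, V1 leftDefault, V2 rightDefault) {
        Set<K> keys = new HashSet<>(left.keySet());
        keys.addAll(right.keySet());
        Map<K, Map.Entry<V1, V2>> out = new HashMap<>();
        for (K k : keys) {
            out.put(k, new AbstractMap.SimpleEntry<>(
                left.getOrDefault(k, leftDefault),
                right.getOrDefault(k, rightDefault)));
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Integer> orders = Map.of("alice", 3, "bob", 1);
        Map<String, String> emails = Map.of("bob", "b@x.io", "carol", "c@x.io");
        // "alice" has no email (default used); "carol" has no orders.
        Map<String, Map.Entry<Integer, String>> joined =
            fullOuterJoin(orders, emails, 0, "unknown");
        System.out.println(joined.get("alice")); // prints 3=unknown
        System.out.println(joined.get("carol")); // prints 0=c@x.io
    }
}
```

In Beam the join runs per key over distributed PCollections of KV elements, but the key-union and default-fill behavior is the same.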
- fullUpdate(String, String) - Static method in class org.apache.beam.sdk.io.mongodb.UpdateField
-
Creates an UpdateField that applies the given update operator to the given destination field using the full source document.
- fullyExpand(Map<TupleTag<?>, PValue>) - Static method in class org.apache.beam.sdk.values.PValues
-
Returns all the tagged PCollections represented in the given PValue.
- fun1(ScalaInterop.Fun1&lt;T, V&gt;) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
- fun2(ScalaInterop.Fun2<T1, T2, V>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
- FUNCTION - Enum constant in enum class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout.Kind
- functionToFlatMapFunction(Function<InputT, OutputT>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
- fuse(PipelineTranslator.UnresolvedTranslation<T, T2>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.UnresolvedTranslation
G
- gauge(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can have its new value set, and is aggregated by taking the last reported value.
- gauge(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can have its new value set, and is aggregated by taking the last reported value.
- gauge(MetricName) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can have its new value set, and is aggregated by taking the last reported value.
- Gauge - Interface in org.apache.beam.sdk.metrics
-
A metric that reports the latest value out of reported values.
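The Metrics.gauge entries above describe a metric aggregated by keeping only the last reported value. A self-contained sketch of that last-value-wins semantics (not Beam's Gauge implementation, which is cell-based and runner-managed):

```java
import java.util.concurrent.atomic.AtomicLong;

public class LastValueGauge {
    private final AtomicLong value = new AtomicLong();

    // Each report overwrites the previous one; aggregation keeps only the latest.
    public void set(long v) { value.set(v); }

    public long get() { return value.get(); }

    public static void main(String[] args) {
        LastValueGauge queueDepth = new LastValueGauge();
        queueDepth.set(120);
        queueDepth.set(7);   // the later report wins
        System.out.println(queueDepth.get()); // prints 7
    }
}
```

This is what distinguishes a gauge from a counter: a counter accumulates deltas, while a gauge reports a point-in-time reading and discards history.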
- GaugeImpl - Class in org.apache.beam.runners.jet.metrics
-
Implementation of Gauge.
- GaugeImpl(MetricName) - Constructor for class org.apache.beam.runners.jet.metrics.GaugeImpl
- GaugeResult - Class in org.apache.beam.sdk.metrics
-
The result of a Gauge metric.
- GaugeResult() - Constructor for class org.apache.beam.sdk.metrics.GaugeResult
- GaugeResult.EmptyGaugeResult - Class in org.apache.beam.sdk.metrics
-
Empty GaugeResult, representing no values reported.
- GceMetadataUtil - Class in org.apache.beam.sdk.extensions.gcp.util
- GceMetadataUtil() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GceMetadataUtil
- GcpCredentialFactory - Class in org.apache.beam.sdk.extensions.gcp.auth
-
Construct an oauth credential to be used by the SDK and the SDK workers.
- GcpIoPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.gcp.common
-
A registrar containing the default GCP options.
- GcpIoPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
- GcpOAuthScopesFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpOAuthScopesFactory
- GcpOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
-
Options used to configure Google Cloud Platform specific options such as the project and credentials.
- GcpOptions.DefaultProjectFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Attempts to infer the default project based upon the environment this application is executing within.
- GcpOptions.EnableStreamingEngineFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
EnableStreamingEngine defaults to false unless one of the two experiments is set.
- GcpOptions.GcpOAuthScopesFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Returns the default set of OAuth scopes.
- GcpOptions.GcpTempLocationFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Returns PipelineOptions.getTempLocation() as the default GCP temp location.
- GcpOptions.GcpUserCredentialsFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Attempts to load the GCP credentials.
- GcpPipelineOptionsRegistrar - Class in org.apache.beam.sdk.extensions.gcp.options
-
A registrar containing the default GCP options.
- GcpPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpPipelineOptionsRegistrar
- GCPSecretSessionServiceFactory - Class in org.apache.beam.sdk.io.solace.broker
-
This class implements a SessionServiceFactory that retrieves the basic authentication credentials from a Google Cloud Secret Manager secret.
- GCPSecretSessionServiceFactory() - Constructor for class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
- GCPSecretSessionServiceFactory.Builder - Class in org.apache.beam.sdk.io.solace.broker
- GcpTempLocationFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpTempLocationFactory
- GcpUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpUserCredentialsFactory
- GCS_URI - Static variable in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Pattern that is used to parse a GCS URL.
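The GCS_URI constant above is a regex for parsing gs:// URLs into their parts, which methods like GcsPath.fromUri rely on. A sketch with a simplified, hypothetical pattern (the actual GCS_URI regex in GcsPath may differ in its details):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class GcsUriSketch {
    // Hypothetical pattern, not the real GCS_URI constant: a gs:// URL is a
    // bucket name followed by an optional object path.
    static final Pattern GS_URI = Pattern.compile("^gs://([^/]+)(?:/(.*))?$");

    // Returns {bucket, object} or null if the string is not a gs:// URL.
    static String[] parse(String uri) {
        Matcher m = GS_URI.matcher(uri);
        if (!m.matches()) {
            return null;
        }
        return new String[] {m.group(1), m.group(2) == null ? "" : m.group(2)};
    }

    public static void main(String[] args) {
        String[] parts = parse("gs://my-bucket/path/to/object.txt");
        System.out.println(parts[0]); // prints my-bucket
        System.out.println(parts[1]); // prints path/to/object.txt
    }
}
```

Splitting a GCS path this way yields exactly the bucket and object components that GcsPath.getBucket() and GcsPath.getObject() expose.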
- GcsCountersOptions() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
- GcsCreateOptions - Class in org.apache.beam.sdk.extensions.gcp.storage
-
An abstract class that contains common configuration options for creating resources.
- GcsCreateOptions() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
- GcsCreateOptions.Builder - Class in org.apache.beam.sdk.extensions.gcp.storage
-
A builder for GcsCreateOptions.
- GcsCustomAuditEntries() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.GcsCustomAuditEntries
- GcsFileSystemRegistrar - Class in org.apache.beam.sdk.extensions.gcp.storage
-
AutoService registrar for the GcsFileSystem.
- GcsFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsFileSystemRegistrar
- GcsOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
-
Options used to configure Google Cloud Storage.
- GcsOptions.ExecutorServiceFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Returns the default ExecutorService to use within the Apache Beam SDK.
- GcsOptions.GcsCustomAuditEntries - Class in org.apache.beam.sdk.extensions.gcp.options
-
Creates a GcsOptions.GcsCustomAuditEntries that holds key-value pairs to be stored as custom information in GCS audit logs.
- GcsOptions.PathValidatorFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Creates a PathValidator object using the class specified in GcsOptions.getPathValidatorClass().
- GcsPath - Class in org.apache.beam.sdk.extensions.gcp.util.gcsfs
-
Implements the Java NIO Path API for Google Cloud Storage paths.
- GcsPath(FileSystem, String, String) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Constructs a GcsPath.
- GcsPathValidator - Class in org.apache.beam.sdk.extensions.gcp.storage
-
GCP implementation of PathValidator.
- GcsReadOptionsFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsReadOptionsFactory
- GcsResourceId - Class in org.apache.beam.sdk.extensions.gcp.storage
-
ResourceId implementation for Google Cloud Storage.
- GcsStager - Class in org.apache.beam.runners.dataflow.util
-
Utility class for staging files to GCS.
- gcsUploadBufferSizeBytes() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
-
The buffer size (in bytes) to use when uploading files to GCS.
- GcsUtil - Class in org.apache.beam.sdk.extensions.gcp.util
-
Provides operations on GCS.
- GcsUtil.CreateOptions - Class in org.apache.beam.sdk.extensions.gcp.util
- GcsUtil.CreateOptions.Builder - Class in org.apache.beam.sdk.extensions.gcp.util
- GcsUtil.GcsCountersOptions - Class in org.apache.beam.sdk.extensions.gcp.util
- GcsUtil.GcsReadOptionsFactory - Class in org.apache.beam.sdk.extensions.gcp.util
- GcsUtil.GcsUtilFactory - Class in org.apache.beam.sdk.extensions.gcp.util
-
This is a DefaultValueFactory able to create a GcsUtil using any transport flags specified on the PipelineOptions.
- GcsUtil.StorageObjectOrIOException - Class in org.apache.beam.sdk.extensions.gcp.util
-
A class that holds either a StorageObject or an IOException.
- GcsUtilFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsUtilFactory
- generate(Schema) - Static method in class org.apache.beam.sdk.coders.RowCoderGenerator
- generateInitialChangeStreamPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.ChangeStreamDao
-
Returns the result from the GenerateInitialChangeStreamPartitions API.
- generateInitialPartitionsAction(ChangeStreamMetrics, ChangeStreamDao, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class for processing DetectNewPartitionsDoFn.
- GenerateInitialPartitionsAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
-
Class to generate the first set of outputs for DetectNewPartitionsDoFn.
- GenerateInitialPartitionsAction(ChangeStreamMetrics, ChangeStreamDao, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.GenerateInitialPartitionsAction
- generateRandom(String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
-
Generates a unique name for the partition metadata table and its indexes.
- generateRowKeyPrefix() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.UniqueIdGenerator
-
Returns a random base64-encoded 8-byte string.
- GenerateSequence - Class in org.apache.beam.sdk.io
-
A PTransform that produces longs starting from the given value, and either up to the given limit or until Long.MAX_VALUE / until the given time elapses.
- GenerateSequence() - Constructor for class org.apache.beam.sdk.io.GenerateSequence
- GenerateSequence.External - Class in org.apache.beam.sdk.io
-
Exposes GenerateSequence as an external transform for cross-language usage.
- GenerateSequence.External.ExternalConfiguration - Class in org.apache.beam.sdk.io
-
Parameters class to expose the transform to an external SDK.
- GenerateSequenceConfiguration() - Constructor for class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
- GenerateSequenceSchemaTransformProvider - Class in org.apache.beam.sdk.providers
- GenerateSequenceSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
- GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration - Class in org.apache.beam.sdk.providers
- GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder - Class in org.apache.beam.sdk.providers
- GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate - Class in org.apache.beam.sdk.providers
- GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate.Builder - Class in org.apache.beam.sdk.providers
- GenerateSequenceSchemaTransformProvider.GenerateSequenceSchemaTransform - Class in org.apache.beam.sdk.providers
- GenerateSequenceTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.seqgen
-
Sequence generator table provider.
- GenerateSequenceTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.seqgen.GenerateSequenceTableProvider
- generic() - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
-
Returns an AvroDatumFactory instance for GenericRecord.
- generic(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an AvroCoder instance for the Avro schema.
- GenericDatumFactory() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.GenericDatumFactory
- GenericDlq - Class in org.apache.beam.sdk.schemas.io
-
Helper to generate a DLQ transform to write PCollection to an external system.
- GenericDlqProvider - Interface in org.apache.beam.sdk.schemas.io
-
A Provider for generic DLQ transforms that handle deserialization failures.
- get() - Method in class org.apache.beam.runners.portability.CloseableResource
-
Gets the underlying resource.
- get() - Static method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
- get() - Method in class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
- get() - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsBuilderFactory
- get() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
-
Returns the estimated throughput bytes for this run.
- get() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
-
Returns the estimated throughput bytes for now.
- get() - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.ThroughputEstimator
-
Returns the estimated throughput for now.
- get() - Method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
- get() - Method in class org.apache.beam.sdk.io.kafka.KafkaIOUtils.MovingAvg
- get() - Method in interface org.apache.beam.sdk.options.ValueProvider
-
Returns the runtime value wrapped by this ValueProvider in case it is ValueProvider.isAccessible(), otherwise fails.
- get() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
- get() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
- get() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
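The ValueProvider.get() entries above share one contract: get() succeeds only while isAccessible() is true, and a static provider is always accessible. A plain-Java sketch of that contract (illustrative only, not the Beam classes or their actual signatures):

```java
public class ValueProviderSketch {
    // Minimal stand-in for the ValueProvider contract.
    interface Provider<T> {
        T get();
        boolean isAccessible();
    }

    // Like StaticValueProvider: the value is known at construction time.
    static <T> Provider<T> ofStatic(T value) {
        return new Provider<T>() {
            public T get() { return value; }
            public boolean isAccessible() { return true; }
        };
    }

    // Like a runtime-only value before execution: not yet accessible, so
    // get() fails instead of returning something stale.
    static <T> Provider<T> deferred() {
        return new Provider<T>() {
            public T get() {
                throw new IllegalStateException("Value only available at runtime");
            }
            public boolean isAccessible() { return false; }
        };
    }

    public static void main(String[] args) {
        Provider<String> p = ofStatic("table-name");
        System.out.println(p.isAccessible()); // prints true
        System.out.println(p.get());          // prints table-name
    }
}
```

Checking isAccessible() before calling get() is the intended usage pattern at pipeline construction time.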
- get() - Method in interface org.apache.beam.sdk.transforms.Materializations.IterableView
-
Returns an iterable for all values.
- get() - Method in interface org.apache.beam.sdk.transforms.Materializations.MultimapView
-
Returns an iterable of all keys.
- get() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
- get(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedList
- get(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedRow
- get(int) - Method in class org.apache.beam.sdk.values.PCollectionList
-
Returns the PCollection at the given index (origin zero).
- get(int) - Method in class org.apache.beam.sdk.values.TupleTagList
-
Returns the TupleTag at the given index (origin zero).
- get(K) - Method in interface org.apache.beam.sdk.transforms.Materializations.MultimapView
-
Returns an iterable of all the values for the specified key.
- get(Long) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
Returns the Broadcast containing the GlobalWatermarkHolder.SparkWatermarks mapped to their sources.
- get(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedMap
- get(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- get(String) - Method in interface org.apache.beam.sdk.state.TimerMap
- get(String) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
- get(String) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
Returns the PCollection associated with the given tag in this PCollectionTuple.
- get(K) - Method in interface org.apache.beam.sdk.state.MapState
-
A deferred lookup, using null values if the item is not found.
- get(K) - Method in interface org.apache.beam.sdk.state.MultimapState
-
A deferred lookup, returns an empty iterable if the item is not found.
- get(K, W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
-
Returns an Iterable of values representing the bag user state for the given key and window.
- get(K, W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.MultimapSideInputHandler
-
Returns an Iterable of values representing the side input for the given key and window.
- get(ObjectT) - Method in interface org.apache.beam.sdk.schemas.FieldValueGetter
- get(JobInfo) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageContextFactory
- get(JobInfo) - Method in interface org.apache.beam.runners.fnexecution.control.ExecutableStageContext.Factory
-
Get or create ExecutableStageContext for the given JobInfo.
- get(JobInfo) - Method in class org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory
- get(JobInfo) - Method in class org.apache.beam.runners.spark.translation.SparkExecutableStageContextFactory
- get(BoundedWindow) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues.ByWindow
- get(BoundedWindow) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues
- get(BoundedWindow) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues.Global
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkSideInputReader
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.CachedSideInputReader
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.util.CachedSideInputReader
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.twister2.utils.Twister2SideInputReader
- get(PValue) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Retrieves the object of type T associated with the given PValue.
- get(TupleTag<T>) - Method in interface org.apache.beam.sdk.transforms.DoFn.MultiOutputReceiver
-
Returns a
DoFn.OutputReceiver for the given tag. - get(TupleTag<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
- get(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.CombineFns.CoCombineResult
-
Returns the value represented by the given
TupleTag. - get(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema.AbstractGetterTypeSupplier
- get(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
- get(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.SetterTypeSupplier
- get(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema.JavaFieldTypeSupplier
- get(TypeDescriptor<?>) - Method in interface org.apache.beam.sdk.schemas.utils.FieldValueTypeSupplier
-
Return all the FieldValueTypeInformations.
- get(TypeDescriptor<?>, Schema) - Method in interface org.apache.beam.sdk.schemas.utils.FieldValueTypeSupplier
-
Return all the FieldValueTypeInformations.
- get(W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.IterableSideInputHandler
-
Returns an
Iterable of values representing the side input for the given window. - get(W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.MultimapSideInputHandler
-
Returns an
Iterable of keys representing the side input for the given window. - getAcceptedIssuers() - Method in class org.apache.beam.sdk.io.splunk.CustomX509TrustManager
- getAccessKey() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getAccountName() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getAccum() - Method in interface org.apache.beam.sdk.state.CombiningState
-
Read the merged accumulator for this state cell.
- getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
-
Returns the
TypeVariable of AccumT. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
-
Returns the
TypeVariable of AccumT. - getAccumulatorCoder(CoderRegistry, Coder<TimestampedValue<KV<EventKeyT, KV<Long, EventT>>>>) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- getAccumulatorCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
- getAccumulatorCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
- getAccumulatorCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
- getAccumulatorCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
- getAccumulatorCoder(CoderRegistry, Coder<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- getAccumulatorCoder(CoderRegistry, Coder<byte[]>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
- getAccumulatorCoder(CoderRegistry, Coder<String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
-
Returns the
Coder to use for accumulator AccumT values, or null if it is not able to be inferred. - getAccumulatorCoder(CoderRegistry, Coder<Boolean>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- getAccumulatorCoder(CoderRegistry, Coder<Double>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- getAccumulatorCoder(CoderRegistry, Coder<Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- getAccumulatorCoder(CoderRegistry, Coder<Long>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- getAccumulatorCoder(CoderRegistry, Coder<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
- getActiveWorkRefreshPeriodMillis() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getAdditionalInputs() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
- getAdditionalInputs() - Method in class org.apache.beam.sdk.io.WriteFiles
- getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns the side inputs of this
Combine, tagged with the tag of the PCollectionView. - getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
Returns the side inputs of this
Combine, tagged with the tag of the PCollectionView. - getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
-
Returns the side inputs of this
ParDo, tagged with the tag of the PCollectionView. - getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
-
Returns the side inputs of this
ParDo, tagged with the tag of the PCollectionView. - getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Returns all
PValues that are consumed as inputs to this PTransform that are independent of the expansion of the PTransform within PTransform.expand(PInput). - getAdditionalOutputTags() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
- getAddresses() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getAlgorithm() - Method in enum class org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
-
Returns the string representation of this type.
- getAlgorithm() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
- getAliases() - Static method in class org.apache.beam.sdk.managed.ManagedSchemaTransformProvider
- getAll() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Retrieves all HL7v2 Messages from a PCollection of message IDs (such as from a PubSub notification subscription).
- getAll() - Method in class org.apache.beam.sdk.values.PCollectionList
-
Returns an immutable List of all the
PCollections in this PCollectionList. - getAll() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Returns an immutable Map from tag to corresponding
PCollection, for all the members of this PCollectionRowTuple. - getAll() - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
Returns an immutable Map from
TupleTag to corresponding PCollection, for all the members of this PCollectionTuple. - getAll() - Method in class org.apache.beam.sdk.values.TupleTagList
-
Returns an immutable List of all the
TupleTags in this TupleTagList. - getAll(String) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Like
CoGbkResult.getAll(TupleTag) but using a String instead of a TupleTag. - getAll(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Returns the values from the table represented by the given
TupleTag<V> as an Iterable<V> (which may be empty if there are no results). - getAllFields() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
If true, all fields are being accessed.
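CoGbkResult.getAll(TupleTag) above returns the values filed under a type-carrying tag, or an empty Iterable when there are none. The stand-in below models that tag-based lookup in plain Java; Tag and TaggedValues are illustrative names, not the Beam API.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Simplified model of tag-keyed value lookup: each Tag identifies one "table"
// of values, and getAll returns that table's values, possibly empty.
public class TaggedValues {
    // A typed key: the type parameter documents the value type of its table.
    public static final class Tag<V> {
        final String id;
        public Tag(String id) { this.id = id; }
    }

    private final Map<Tag<?>, List<?>> tables = new HashMap<>();

    public <V> void put(Tag<V> tag, List<V> values) {
        tables.put(tag, new ArrayList<>(values));
    }

    @SuppressWarnings("unchecked")
    public <V> List<V> getAll(Tag<V> tag) {
        // A missing tag yields an empty list rather than null, mirroring the
        // "may be empty if there are no results" contract in the entry above.
        return (List<V>) tables.getOrDefault(tag, Collections.emptyList());
    }

    public static void main(String[] args) {
        Tag<String> emails = new Tag<>("emails");
        Tag<Integer> orders = new Tag<>("orders");
        TaggedValues result = new TaggedValues();
        result.put(emails, List.of("a@x.com", "b@x.com"));
        assert result.getAll(emails).size() == 2;
        assert result.getAll(orders).isEmpty();
        System.out.println("ok");
    }
}
```

The typed tag is what lets a single heterogeneous result expose a type-safe per-table view without casts at the call site.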
- getAllIds(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getAllJobs() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- getAllMetadata() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Deprecated. Use schema options instead.
- getAllowDuplicates() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getAllowDuplicates() - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeArbitrarily
- getAllowDuplicates() - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeByKey
- getAllowedLateness() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getAllowedTimestampSkew() - Method in class org.apache.beam.sdk.transforms.DoFn
-
Deprecated. This method permits a
DoFn to emit elements behind the watermark. These elements are considered late, and if behind the allowed lateness of a downstream PCollection may be silently dropped. See https://github.com/apache/beam/issues/18065 for details on a replacement. - getAllowedTimestampSkew() - Method in class org.apache.beam.sdk.transforms.WithTimestamps
-
Deprecated. This method permits elements to be emitted behind the watermark. These elements are considered late, and if behind the
allowed lateness of a downstream PCollection may be silently dropped. See https://github.com/apache/beam/issues/18065 for details on a replacement. - getAllowlist() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
- getAllowNonRestoredState() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getAllPartitionsCreatedAfter(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Fetches all partitions with a
PartitionMetadataAdminDao.COLUMN_CREATED_AT less than the given timestamp. - getAllRows(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getAllWorkerStatuses(long, TimeUnit) - Method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
-
Get all the statuses from all connected SDK harnesses within the specified timeout.
- getAlpha() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPFieldRef
- getAlsoStartLoopbackWorker() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- getAndMaybeCreateSplitOutput(ReaderOutput<OutputT>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase.ReaderAndOutput
- getAnnotatedConstructor(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- getAnnotatedCreateMethod(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- getAnnotations() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Returns annotations map to provide additional hints to the runner.
- getApiKey() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getApiPrefix() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
Generates the API endpoint prefix based on the set values.
- getApiRootUrl() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The root URL for the Dataflow API.
- getApiServiceDescriptor() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Get an
Endpoints.ApiServiceDescriptor describing the endpoint this GrpcFnServer is bound to. - getAppend() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
- getAppId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getApplicationName() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- getAppliedFn(CoderRegistry, Coder<? extends KV<K, ? extends Iterable<InputT>>>, WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
-
Returns the
Combine.CombineFn bound to its coders. - getApplyMethod(ScalarFn) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFnReflector
-
Gets the method annotated with
ScalarFn.ApplyMethod from scalarFn. - getAppName() - Method in interface org.apache.beam.sdk.options.ApplicationNameOptions
-
Name of application, for display purposes.
- getAppProfileId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
-
Returns the app profile being read from.
- getApproximateArrivalTimestamp() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
- getArgument() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
-
An optional argument to configure the type.
- getArguments() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.LocalMktDate
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCDateOnly
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
- getArgumentType() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
-
A schema type representing how to interpret the argument.
- getArgumentTypes(Method) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a list of argument types for the given method, which must be a part of the class.
- getArity() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- getArity() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- getArray(int) - Method in class org.apache.beam.sdk.values.Row
-
Get an array value by field index,
IllegalStateExceptionis thrown if schema doesn't match. - getArray(String) - Method in class org.apache.beam.sdk.values.Row
-
Get an array value by field name,
IllegalStateException is thrown if the schema doesn't match. - getArtifact(ArtifactApi.GetArtifactRequest, StreamObserver<ArtifactApi.GetArtifactResponse>) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- getArtifact(RunnerApi.ArtifactInformation) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- getArtifactPort() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- getArtifactStagingPath() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- getATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- getATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- getAttachedMode() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getAttachmentBytes() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the attachment data of the message as a byte array, if any.
- getAttempted() - Method in class org.apache.beam.sdk.metrics.MetricResult
-
Return the value of this metric across all attempts of executing all parts of the pipeline.
- getAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the given attribute value.
- getAttributeId() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getAttributeId() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getAttributeMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the full map of attributes.
- getAttributeMap() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getAttributesMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getAttributesMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- getAuthenticator() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getAuthenticator() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getAuthToken(String, String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
-
Certain embedded scenarios allow for having no authentication at all.
- getAutoCommit() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getAutoOffsetResetConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getAutoscalingAlgorithm() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
The autoscaling algorithm to use for the worker pool.
- getAutosharding() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getAutoSharding() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getAutoValueGenerated(TypeDescriptor<?>) - Static method in class org.apache.beam.sdk.schemas.utils.AutoValueUtils
- getAutoValueGeneratedBuilder(TypeDescriptor<?>) - Static method in class org.apache.beam.sdk.schemas.utils.AutoValueUtils
- getAutoWatermarkInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getAvroBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Returns a function mapping encoded AVRO
GenericRecords to Beam Rows. - getAvroFilterFormatFunction(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- getAwsCredentialsProvider() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
AwsCredentialsProvider used to configure AWS service clients. - getAwsRegion() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
Region used to configure AWS service clients.
- getAzureConnectionString() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getAzureCredentialsProvider() - Method in interface org.apache.beam.sdk.io.azure.options.AzureOptions
-
The credential instance that should be used to authenticate against Azure services.
- getBacking() - Method in class org.apache.beam.sdk.fn.data.WeightedList
- getBacklogBytes(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClient
- getBacklogBytes(String) - Method in class org.apache.beam.sdk.io.solace.broker.SempBasicAuthClientExecutor
- getBacklogBytes(String) - Method in interface org.apache.beam.sdk.io.solace.broker.SempClient
-
Retrieves the size of the backlog (in bytes) for the specified queue.
- getBacklogCheckTime() - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
-
The time at which the latest offset for the partition was fetched in order to calculate backlog.
- getBadRecordErrorHandler() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getBadRecordErrorHandler() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getBadRecordErrorHandler() - Method in class org.apache.beam.sdk.io.WriteFiles
- getBadRecordRouter() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getBadRecordRouter() - Method in class org.apache.beam.sdk.io.WriteFiles
- getBagUserStateSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
Get a mapping from PTransform id to user state input id to
bag user states that are used during execution. - getBaseAutoValueClass(TypeDescriptor<?>) - Static method in class org.apache.beam.sdk.schemas.utils.AutoValueUtils
- getBaseName() - Method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.ParsedMetricName
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.LocalMktDate
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCDateOnly
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
- getBaseType() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
-
The base
Schema.FieldType used to store values of this type. - getBaseValue(int) - Method in class org.apache.beam.sdk.values.Row
-
Returns the base type for this field.
- getBaseValue(int, Class<T>) - Method in class org.apache.beam.sdk.values.Row
-
Returns the base type for this field.
- getBaseValue(String) - Method in class org.apache.beam.sdk.values.Row
-
Returns the base type for this field.
- getBaseValue(String, Class<T>) - Method in class org.apache.beam.sdk.values.Row
-
Returns the base type for this field.
- getBaseValues() - Method in class org.apache.beam.sdk.values.Row
-
Return a list of data values.
- getBatchClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- getBatchCombinePerKeyOperator(FlinkStreamingTranslationContext, PCollection<KV<K, InputT>>, Map<Integer, PCollectionView<?>>, List<PCollectionView<?>>, Coder<WindowedValue<KV<K, AccumT>>>, CombineFnBase.GlobalCombineFn<InputT, AccumT, ?>, WindowDoFnOperator<K, AccumT, OutputT>, TypeInformation<WindowedValue<KV<K, OutputT>>>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
- getBatchDuration() - Method in class org.apache.beam.runners.spark.io.CreateStream
- getBatchDuration(SerializablePipelineOptions) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Retrieves the batch duration in milliseconds from Spark pipeline options.
- getBatches() - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Get the underlying queue representing the mock stream of micro-batches.
- getBatching() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getBatchingParams() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
-
Returns user supplied parameters for batching.
- getBatchingParams() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.WithShardedKey
-
Returns user supplied parameters for batching.
- getBatchInitialCount() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The initial size of a batch; used in the absence of the QoS system having significant data to determine a better batch size.
- getBatchIntervalMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- getBatchMaxBytes() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The maximum number of bytes to include in a batch.
- getBatchMaxCount() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The maximum number of writes to include in a batch.
- getBatchService() - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices
- getBatchService() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeServicesImpl
- getBatchSize() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getBatchSize() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
- getBatchSize() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- getBatchSizeBytes() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getBatchSizeBytes() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- getBatchSizeBytes() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getBatchSizeBytes() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- getBatchTargetLatency() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Target latency for batch requests.
- getBeamCheckpointDir() - Method in class org.apache.beam.runners.spark.translation.streaming.Checkpoint.CheckpointDir
- getBeamRelInput(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- getBeamSchemaFromProto(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
-
Retrieves a Beam Schema from a Protocol Buffer message.
- getBeamSchemaFromProtoSchema(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
-
Parses the given Protocol Buffers schema string, retrieves the Descriptor for the specified message name, and constructs a Beam Schema from it.
- getBeamSplitSource() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplit
- getBeamSqlTable() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- getBeamSqlUdfs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
-
For UDFs implement
BeamSqlUdf. - getBearerToken() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getBigLakeConfiguration() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getBigQueryEndpoint() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
BQ endpoint to use.
- getBigQueryLocation() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
- getBigQueryProject() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getBigtableChangeStreamInstanceId() - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.BigtableChangeStreamTestOptions
- getBigtableClientOverride() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
-
Returns the Bigtable client override.
- getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Deprecated. read options are configured directly on BigtableIO.read(). Use
BigtableIO.Read.populateDisplayData(DisplayData.Builder) to view the current configurations.
- getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Deprecated. write options are configured directly on BigtableIO.write(). Use
BigtableIO.Write.populateDisplayData(DisplayData.Builder) to view the current configurations.
- getBlobServiceEndpoint() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
-
The Azure Blobstore service endpoint used by the Blob service client.
- getBlobstoreClientFactoryClass() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getBlockOffset() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShard
-
Return the absolute position within the Ism file where the data block begins.
- getBloomFilterPosition() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- getBody() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
Message body.
- getBody() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getBoolean() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getBoolean(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Boolean value by field index, ClassCastException is thrown if schema doesn't match.
- getBoolean(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.BOOLEAN value by field name, IllegalStateException is thrown if schema doesn't match.
- getBoolean(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getBoolean(Map<String, Object>, String, Boolean) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getBootstrapServers() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- getBootstrapServers() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
-
Sets the bootstrap servers for the Kafka consumer.
- getBootstrapServers() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getBoundedness() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- getBoundednessOfRelNode(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
-
This method returns the Boundedness of a RelNode.
- getBoundedTrie(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- getBoundedTrie(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the
BoundedTrie that should be used for implementing the given metricName in this container.
- getBoundedTries() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the bounded tries that matched the filter.
- getBqStreamingApiLoggingFrequencySec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getBranch() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getBroadcastSizeEstimate() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
- getBucket() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Returns the bucket name associated with this GCS path, or an empty string if this is a relative path component.
- getBucket(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Get the
Bucket from Cloud Storage path or propagates an exception.
- getBucketKeyEnabled() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getBucketKeyEnabled() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Whether to use an S3 Bucket Key for object encryption with server-side encryption using AWS KMS (SSE-KMS) or not.
- getBufferSize() - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
- getBuilderCreator(TypeDescriptor<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.AutoValueUtils
-
Try to find an accessible builder class for creating an AutoValue class.
- getBuiltinMethods() - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BeamBuiltinFunctionProvider
- getBulkDirective() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
- getBulkEndPoint() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getBulkIO() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- getBundle() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
-
FHIR R4 bundle resource object as a string.
- getBundle(OutputReceiverFactory, TimerReceiverFactory, StateRequestHandler, BundleProgressHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
-
Get a new
bundlefor processing the data in an executable stage. - getBundle(OutputReceiverFactory, TimerReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
- getBundle(OutputReceiverFactory, TimerReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler, BundleCheckpointHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
- getBundle(OutputReceiverFactory, StateRequestHandler, BundleProgressHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
-
Get a new
bundlefor processing the data in an executable stage. - getBundle(OutputReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
- getBundle(OutputReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler, BundleCheckpointHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
- getBundleFinalizer() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- getBundleProcessorCacheTimeout() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
- getBundleSize() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- getByte() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getByte(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.BYTE value by field index, ClassCastException is thrown if schema doesn't match.
- getByte(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.BYTE value by field name, IllegalStateException is thrown if schema doesn't match.
- getBytes() - Method in class org.apache.beam.sdk.io.range.ByteKey
-
Returns a newly-allocated
byte[] representing this ByteKey.
- getBytes(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.BYTES value by field index, ClassCastException is thrown if schema doesn't match.
- getBytes(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.BYTES value by field name, IllegalStateException is thrown if schema doesn't match.
- getBytes(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getBytes(Map<String, Object>, String, byte[]) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getBytesPerOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns approximately how many bytes of data correspond to a single offset in this source.
- getBytesToRowFn(Schema) - Static method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider
- getCacheCandidates() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Get the map of cache candidates hold by the evaluation context.
- getCacheTokens() - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandler
-
Retrieves a list of valid cache tokens.
- getCalciteConnectionProperties() - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
- getCallable() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
- getCandidatesForGroupByKeyAndWindowTranslation() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Get the map of GBK transforms to their full names, which are candidates for group by key and window translation which aims to reduce memory usage.
- getCaseEnumType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Returns the
EnumerationType that is used to represent the case type.
- getCaseSensitive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getCaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
-
Returns the enumeration that specified which OneOf field is set.
- getCatalog() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getCatalog() - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogSchema
- getCatalog(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.CatalogManager
-
Attempts to fetch the catalog with this name.
- getCatalog(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.EmptyCatalogManager
- getCatalog(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogManager
- getCatalogConfig() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getCatalogName() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- getCatalogName() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getCatalogProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- getCatalogProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getCatalogs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.CatalogRegistrar
- getCatalogs() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogRegistrar
- getCatalogs() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergCatalogRegistrar
- getCatalogSchema(TableName) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogManagerSchema
- getCause() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
- getCellsMutatedPerColumn(String, String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
-
Return the total number of cells affected when the specified column is mutated.
- getCellsMutatedPerRow(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
-
Return the total number of cells affected when the given row is deleted.
- getCEPFieldRefFromParKeys(ImmutableBitSet) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
Transform the partition columns into serializable CEPFieldRef.
- getCepKind() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperator
- getCEPPatternFromPattern(Schema, RexNode, Map<String, RexNode>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
Construct a list of
CEPPatterns from a RexNode.
- getChangeSequenceNumber() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
-
The value supplied to the BigQuery
_CHANGE_SEQUENCE_NUMBER pseudo-column.
- getChangeStreamContinuationTokens() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- getChangeStreamContinuationTokens() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- getChangeStreamDao() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- getChangeStreamDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Creates and returns a singleton DAO instance for querying a partition change stream.
- getChangeStreamName() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- getChangeStreamName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- getChangeStreamNamePrefix() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
-
Return the prefix used to identify the rows belonging to this job.
- getChangeStreamNamePrefix() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
- getChannelFactory() - Method in class org.apache.beam.sdk.io.CompressedSource
- getChannelNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- getChannelzShowOnlyWindmillServiceChannels() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getCharset() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration
- getCheckpointDir() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- getCheckpointDurationMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- getCheckpointingInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getCheckpointingInterval() - Method in interface org.apache.beam.sdk.io.kafka.KafkaIO.Read.FakeFlinkPipelineOptions
- getCheckpointingMode() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getCheckpointMark() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplit
- getCheckpointMark() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
- getCheckpointMark() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getCheckpointMark() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns a
UnboundedSource.CheckpointMark representing the progress of this UnboundedReader.
- getCheckpointMarkCoder() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
- getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
- getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.UnboundedSource
-
Returns a
Coder for encoding and decoding the checkpoints for this source.
- getCheckpointTimeoutMillis() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getCheckStopReadingFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getChildPartitions() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
List of child partitions yielded within this record.
- getChildRels(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- getClass(String, String) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler
- getClasses() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a set of
TypeDescriptors, one for each superclass (including this class).
- getClassName() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Gets the name of the Java class that this CloudObject represents.
- getClazz() - Method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
- getClient(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- getClientBuilderFactory() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
- getClientFactory() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getClientInfo() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getClientInfo(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getClock() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getCloningBehavior() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use
TestPipeline with the DirectRunner.
- getCloseStream() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- getClosingBehavior() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getClosure() - Method in class org.apache.beam.sdk.transforms.Contextful
-
Returns the closure.
- getClusterId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getClusteringFields() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getClusterName() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- getClusterType() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
-
Returns the type code of the column.
- getCodec(DestinationT) - Method in class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
-
Return an AVRO codec for a given destination.
- getCodeJarPathname() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- getCoder() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- getCoder() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- getCoder() - Method in class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
- getCoder() - Method in class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
- getCoder() - Method in class org.apache.beam.sdk.coders.DelegateCoder
-
Returns the coder used to encode/decode the intermediate values produced/consumed by the coding functions of this
DelegateCoder.
- getCoder() - Method in class org.apache.beam.sdk.fn.data.DataEndpoint
- getCoder() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
- getCoder() - Method in interface org.apache.beam.sdk.io.singlestore.SingleStoreIO.RowMapperWithCoder
- getCoder() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record
-
The coder for the record, or null if there is no coder.
- getCoder() - Static method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns a
Coder suitable for IntervalWindow.
- getCoder() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the
Coder used by this PCollection to encode and decode the values stored in it.
- getCoder(Class<? extends T>, Class<T>, Map<Type, ? extends Coder<?>>, TypeVariable<?>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
Deprecated. This method is to change in an unknown backwards incompatible way once support for this functionality is refined.
- getCoder(Class<T>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
Returns the
Coder to use for values of the given class.
- getCoder(CoderRegistry) - Method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- getCoder(CoderRegistry) - Method in interface org.apache.beam.sdk.io.kafka.DeserializerProvider
- getCoder(Pipeline) - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
- getCoder(TypeDescriptor<OutputT>, TypeDescriptor<InputT>, Coder<InputT>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
Deprecated. This method is to change in an unknown backwards incompatible way once support for this functionality is refined.
- getCoder(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
Returns the
Coder to use for values of the given type.
- getCoderArguments() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- getCoderArguments() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.MetadataKeyCoder
- getCoderArguments() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.AtomicCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.Coder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.CustomCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.KvCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.MapCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.NullableCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.OptionalCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.SnappyCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.VarLongCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.ZstdCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.io.ReadableFileCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
- getCoderArguments() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
- getCoderArguments() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.values.WindowedValues.ValueOnlyWindowedValueCoder
-
Deprecated.
- getCoderInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
Deprecated. this method will be removed entirely. The
PCollection underlying a side input, including its Coder, is part of the side input's specification with a ParDo transform, which will obtain that information via a package-private channel.
- getCoderInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
- getCoderProvider() - Static method in class org.apache.beam.sdk.coders.SerializableCoder
-
Returns a
CoderProvider which uses the SerializableCoder if possible for all types.
- getCoderProvider() - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns a
CoderProvider which uses the AvroCoder if possible for all types.
- getCoderProvider() - Static method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.hadoop.WritableCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
- getCoderProviders() - Method in interface org.apache.beam.sdk.coders.CoderProviderRegistrar
-
Returns a list of
coder providers which will be registered by default within each coder registry instance.
- getCoderProviders() - Method in class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.coders.SerializableCoder.SerializableCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtobufCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder.WritableCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.io.hbase.HBaseCoderProviderRegistrar
- getCoderRegistry() - Method in class org.apache.beam.sdk.Pipeline
-
Returns the
CoderRegistry that this Pipeline uses.
- getCoderTranslators() - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderRegistrar
- getCoderURNs() - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderRegistrar
- getCoGbkResultSchema() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Returns the
CoGbkResultSchema associated with this KeyedPCollectionTuple.
- getCohorts() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
-
Returns a list of sets of expressions that should be on the same level.
- getCollations() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- getCollection() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
-
Returns the underlying PCollection of this TaggedKeyedPCollection.
- getCollectionElementType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- getColumnDelimiter() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getColumnDelimiter() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- getColumnDelimiter() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getColumns() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getColumns() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getColumns() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
- getColumns(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
- getCombineFn() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
- getCombineFn() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
- getComment() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
- getCommitDeadline() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getCommitRetrySettings() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getCommitted() - Method in class org.apache.beam.sdk.metrics.MetricResult
-
Return the value of this metric across all successfully completed parts of the pipeline.
- getCommittedOrNull() - Method in class org.apache.beam.sdk.metrics.MetricResult
-
Return the value of this metric across all successfully completed parts of the pipeline, or null if it is not available.
- getCommitTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
-
Returns the commit timestamp of the read / write transaction.
- getCommitTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The timestamp at which the modifications within were committed in Cloud Spanner.
- getCommitTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEventRecord
-
Returns the timestamp at which the key range change occurred.
- getComponents() - Method in class org.apache.beam.sdk.coders.AtomicCoder
- getComponents() - Method in class org.apache.beam.sdk.coders.StructuredCoder
- getComponents() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
- getComponents() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
-
Hierarchy list of component paths making up the full path, starting with the top-level child component path.
- getComponents() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- getComponents() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
- getComponents() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
- getComponents() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
- getComponents() - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- getComponents(AvroGenericCoder) - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderTranslator
- getComponentType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns the component type if this type is an array type, otherwise returns null.
- getCompression() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
-
Returns the method with which this file will be decompressed in FileIO.ReadableFile.open().
- getCompression() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
-
See Compression for expected values.
- getCompression() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
- getCompression() - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration
- getCompression() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getCompressionCodecName() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration
- getComputeNumShards() - Method in class org.apache.beam.sdk.io.WriteFiles
- getConfig() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceProviderFromDataSourceConfiguration
- getConfigProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- getConfigProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getConfigUpdates() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- getConfiguration() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
- getConfiguration() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
- getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
- getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
- getConfigurationMap() - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Returns the configuration map.
- getConfigurationRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform
- getConfigurationRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteSchemaTransformProvider.BigQueryWriteSchemaTransform
- getConfigurationRow() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransform
- getConfigurationRow() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransform
- getConfiguredLoggerFromOptions(SdkHarnessOptions) - Static method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Configure log manager's default log level and log level overrides from the sdk harness options, and return the list of configured loggers.
- getConfluentSchemaRegistrySubject() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getConfluentSchemaRegistryUrl() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getConnection(InfluxDbIO.DataSourceConfiguration, boolean) - Static method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO
- getConnectionInitSql() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getConnectionInitSql() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getConnectionProperties() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getConnectionProperties() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getConnector() - Method in enum class org.apache.beam.io.debezium.Connectors
-
The Debezium connector class for this connector type.
- getConnectStringPrefix() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
- getConnectTimeout() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getConstructorCreator(TypeDescriptor<?>, Constructor, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- getConstructorCreator(TypeDescriptor<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.AutoValueUtils
-
Try to find an accessible constructor for creating an AutoValue class.
- getConstructorCreator(TypeDescriptor<T>, Constructor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- getConsumerConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getConsumerConfigUpdates() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getConsumerFactoryFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getConsumerFactoryFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getConsumerPollingTimeout() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getContainerImageBaseRepository() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the base repository for constructing the container image path.
- getContent() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Returns the extracted text.
- getContentEncoding() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getContentType() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
-
The content type for the created file, e.g. "text/plain".
- getContentType() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getContext() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
- getContext() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets the context of a plugin.
- getContiguousSequenceRangeReevaluationFrequency() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler.OrderedProcessingGlobalSequenceHandler
-
How frequently the combiner should reevaluate the maximum range. This parameter only affects the behaviour of streaming pipelines.
- getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
- getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
Return a trigger to use after a GroupByKey to preserve the intention of this trigger.
- getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger.OnceTrigger
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Never.NeverTrigger
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
Subclasses should override this to return the Trigger.getContinuationTrigger() of this Trigger.
- getConversionOptions(ObjectNode) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
- getConvertedSchemaInformation(Schema, TypeDescriptor<T>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroSchemaInformationProvider
- getConvertedSchemaInformation(Schema, TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.utils.RowSchemaInformationProvider
- getConvertedSchemaInformation(Schema, TypeDescriptor<T>) - Method in interface org.apache.beam.sdk.schemas.utils.SchemaInformationProvider
- getConvertedSchemaInformation(Schema, TypeDescriptor<T>, SchemaRegistry) - Static method in class org.apache.beam.sdk.schemas.utils.ConvertHelpers
-
Get the coder used for converting from an inputSchema to a given type.
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.DateConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeMicrosConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampMicrosConversion
- getConvertPrimitive(Schema.FieldType, TypeDescriptor<?>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.ConvertHelpers
-
Returns a function to convert a Row into a primitive type.
- getCorrelationId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getCosmosClientBuilder() - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
- getCosmosKey() - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
-
The Azure Cosmos key used to authenticate access to the resource.
- getCosmosServiceEndpoint() - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
-
The Azure Cosmos service endpoint used by the Cosmos client.
- getCount() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- getCount() - Method in class org.apache.beam.sdk.metrics.DistributionResult
- getCount() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
- getCountBackoffs() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count invocations of BackOff.nextBackOffMillis().
- getCountCacheReadFailures() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count Cache read failures.
- getCountCacheReadNonNulls() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count associated non-null values resulting from Cache reads.
- getCountCacheReadNulls() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count associated null values resulting from Cache reads.
- getCountCacheReadRequests() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count number of attempts to read from the Cache.
- getCountCacheWriteFailures() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count Cache write failures.
- getCountCacheWriteRequests() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count number of attempts to write to the Cache.
- getCountCacheWriteSuccesses() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count Cache write successes.
- getCountCalls() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count invocations of Caller.call(RequestT).
- getCountEndPoint() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getCounter(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- getCounter(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the Counter that should be used for implementing the given metricName in this container.
- getCounters() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the counters that matched the filter.
- getCounters(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
- getCounters(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
- getCountFailures() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count failures resulting from Call's successful Caller invocation.
- getCountRequests() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count incoming request elements processed by Call's DoFn.
- getCountResponses() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count outgoing responses resulting from Call's successful Caller invocation.
- getCountryOfResidence() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
- getCountSetup() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count invocations of SetupTeardown.setup().
- getCountShouldBackoff() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count when CallShouldBackoff.isTrue() is found true.
- getCountSleeps() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count invocations of Sleeper.sleep(long).
- getCountTeardown() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count invocations of SetupTeardown.teardown().
- getCpu() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- getCpuRate() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- getCreatedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which this partition was first detected and created in the metadata table.
- getCreatedAtIndexName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
- getCreateDisposition() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getCreateDisposition() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
- getCreateDisposition() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
- getCreateFromSnapshot() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
If set, the snapshot from which the job should be created.
- getCreateTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets create time.
- getCreator(TypeDescriptor<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Get an object creator for an AVRO-generated SpecificRecord.
- getCredential() - Method in interface org.apache.beam.sdk.extensions.gcp.auth.CredentialFactory
- getCredential() - Method in class org.apache.beam.sdk.extensions.gcp.auth.GcpCredentialFactory
-
Returns a default GCP Credentials or null when it fails.
- getCredential() - Method in class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
- getCredential() - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsUserCredentialFactory
-
Returns Credentials as configured by GoogleAdsOptions.
- getCredentialFactoryClass() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
The class of the credential factory that should be created and used to create credentials.
- getCredentials() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getCrossProduct() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration
- getCsvConfiguration() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- getCsvFormat() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.CsvToRow
- getCsvRecord() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
-
The CSV record associated with the caught Exception.
- getCurrent() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
- getCurrent() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
- getCurrent() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
Gets the current record from the delegate reader.
- getCurrent() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getCurrent() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Returns the value of the data item that was read by the last Source.Reader.start() or Source.Reader.advance() call.
- getCurrentBlock() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
- getCurrentBlock() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
Returns the current block (the block that was read by the last successful call to BlockBasedSource.BlockBasedReader.readNextBlock()).
- getCurrentBlockOffset() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
- getCurrentBlockOffset() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
Returns the largest offset such that starting to read from that offset includes the current block.
- getCurrentBlockSize() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
- getCurrentBlockSize() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
Returns the size of the current block in bytes as it is represented in the underlying file, if possible.
- getCurrentBundle() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getCurrentBundleTimestamp() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getCurrentCatalogSchema() - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogManagerSchema
- getCurrentContainer() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
-
Return the
MetricsContainerfor the current thread. - getCurrentDatabaseSchema() - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogSchema
- getCurrentDirectory() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
- getCurrentDirectory() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
- getCurrentDirectory() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
-
Returns the ResourceId that represents the current directory of this ResourceId.
- getCurrentKey() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- getCurrentOffset() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
- getCurrentOffset() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
- getCurrentOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
Returns the starting offset of the current record, which has been read by the last successful Source.Reader.start() or Source.Reader.advance() call.
- getCurrentOutputWatermark() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- getCurrentParent() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Gets the parent composite transform to the current transform, if one exists.
- getCurrentRateLimit() - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
- getCurrentRecord() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
-
Returns the current record.
- getCurrentRecordId() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns a unique identifier for the current record.
- getCurrentRecordId() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
- getCurrentRecordOffset() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
- getCurrentRecordOffset() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
- getCurrentRelativeTime() - Method in interface org.apache.beam.sdk.state.Timer
-
Returns the current relative time used by Timer.setRelative() and Timer.offset(org.joda.time.Duration).
- getCurrentRowAsStruct() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Returns the record at the current pointer as a Struct.
- getCurrentSchemaPlus() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcConnection
-
Calcite-created SchemaPlus wrapper for the current schema.
- getCurrentSource() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
- getCurrentSource() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
- getCurrentSource() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
Returns a Source describing the same input that this Reader currently reads (including items already read).
- getCurrentSource() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
- getCurrentSource() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getCurrentSource() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
- getCurrentSource() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Returns a Source describing the same input that this Reader currently reads (including items already read).
- getCurrentSource() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns the UnboundedSource that created this reader.
- getCurrentTimestamp() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
By default, returns the minimum possible timestamp.
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Returns the timestamp associated with the current data item.
- getCurrentToken() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- getCurrentTransform() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
- getCurrentTransform() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getCurrentTransform() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getCurrentTransform() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getCurrentVersion() - Method in class org.apache.beam.runners.flink.translation.types.UnversionedTypeSerializerSnapshot
- getCurrentVersion() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer.FlinkStateNameSpaceSerializerSnapshot
- getCursor() - Method in class org.apache.beam.sdk.io.redis.RedisCursor
- getCustomBeamRequirement() - Method in interface org.apache.beam.sdk.extensions.python.PythonExternalTransformOptions
- getCustomerId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
- getCustomerProvidedKey() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getCustomError() - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.MatcherAndError
- getCustomError(HttpRequestWrapper, HttpResponseWrapper) - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors
- getDanglingDataSets() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
- getData() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getData() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets data.
- getData(Row) - Method in interface org.apache.beam.sdk.io.iceberg.DynamicDestinations
- getDataAsBytes() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getDatabase() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- getDatabase() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getDatabase() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Getting a Snowflake database.
- getDatabase() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getDatabase() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getDatabaseAdminClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- getDatabaseClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- getDatabaseRole() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getDatabaseSchema(TableName) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogSchema
- getDataBoostEnabled() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getDataCatalogEndpoint() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptions
-
DataCatalog endpoint.
- getDataCatalogSegments() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- getDataCatalogSegments() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
-
Returns the data catalog segments.
- getDataClient() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
- getDataCoder() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
- getDataflowClient() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
An instance of the Dataflow client.
- getDataflowEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Dataflow endpoint to use.
- getDataflowEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Dataflow endpoint to use.
- getDataflowJobFile() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The path to write the translated Dataflow job specification out to at job submission time.
- getDataflowKmsKey() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
GCP Cloud KMS key for Dataflow pipelines and buckets created by GcpTempLocationFactory.
- getDataflowOptions() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
- getDataflowRunnerInfo() - Static method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Returns an instance of
DataflowRunnerInfo. - getDataflowServiceOptions() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Service options are set by the user and configure the service.
- getDataflowWorkerJar() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- getDataResource() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getDataSchema() - Method in interface org.apache.beam.sdk.io.iceberg.DynamicDestinations
- getDataset(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getDataset(String, String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getDataset(String, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Gets the specified
Datasetresource by dataset ID. - getDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
-
Gets the specified
Datasetresource by dataset ID. - getDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getDataset(String, String, Map<String, String>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getDataset(String, Map<String, String>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getDataset(PCollection<T>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
- getDataset(PCollection<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getDataSetOrThrow(String) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
- getDatasetService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
-
Returns a real, mock, or fake
BigQueryServices.DatasetService. - getDatasetService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
- getDatasetService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- getDataSource() - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
- getDataSource() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getDataSourceConfiguration() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
- getDataSourceConfiguration() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
- getDataSourceConfiguration() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getDataSourceProviderFn() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Returns a DataSource provider function for connection credentials.
- getDataStreamOrThrow(String) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
- getDataType() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
- getDateTime() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getDateTime(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.DATETIMEvalue by field index,IllegalStateExceptionis thrown if schema doesn't match. - getDateTime(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.DATETIMEvalue by field name,IllegalStateExceptionis thrown if schema doesn't match. - getDatumFactory() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns the datum factory used for encoding/decoding.
- getDatumReader() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns the DatumReader used for decoding.
- getDatumWriter() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns the DatumWriter used for encoding.
- getDatumWriterFactory(DestinationT) - Method in class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
-
Return a
AvroSink.DatumWriterFactoryfor a given destination. - getDayOfMonth() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- getDayOfMonth() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- getDbSize() - Method in class org.apache.beam.sdk.io.redis.RedisCursor
- getDebeziumConnectionProperties() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- getDecimal() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getDecimal(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a
BigDecimalvalue by field index,ClassCastExceptionis thrown if schema doesn't match. - getDecimal(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.DECIMALvalue by field name,IllegalStateExceptionis thrown if schema doesn't match. - getDef() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl
- getDef() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats
- getDefault() - Static method in class org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter
- getDefaultCoder(TypeDescriptor<?>, CoderRegistry) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
Returns the default coder for a given type descriptor.
- getDefaultDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Returns the default destination.
- getDefaultEnvironmentConfig() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getDefaultEnvironmentType() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getDefaultHeaders() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getDefaultJobName() - Static method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
- getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.Source
-
Deprecated.Override
Source.getOutputCoder()instead. - getDefaultOutputCoder() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Deprecated.Instead, the PTransform should explicitly call
PCollection.setCoder(org.apache.beam.sdk.coders.Coder<T>)on the returned PCollection. - getDefaultOutputCoder(InputT) - Method in class org.apache.beam.sdk.transforms.PTransform
-
Deprecated.Instead, the PTransform should explicitly call
PCollection.setCoder(org.apache.beam.sdk.coders.Coder<T>)on the returned PCollection. - getDefaultOutputCoder(InputT, PCollection<T>) - Method in class org.apache.beam.sdk.transforms.PTransform
-
Deprecated.Instead, the PTransform should explicitly call
PCollection.setCoder(org.apache.beam.sdk.coders.Coder<T>)on the returned PCollection. - getDefaultOutputCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
- getDefaultOutputCoder(CoderRegistry, Coder<Boolean>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- getDefaultOutputCoder(CoderRegistry, Coder<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- getDefaultOutputCoder(CoderRegistry, Coder<TimestampedValue<KV<EventKeyT, KV<Long, EventT>>>>) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
- getDefaultOutputCoder(CoderRegistry, Coder<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- getDefaultOutputCoder(CoderRegistry, Coder<byte[]>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
- getDefaultOutputCoder(CoderRegistry, Coder<String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
-
Returns the
Coderto use by default for outputOutputTvalues, or null if it is not able to be inferred. - getDefaultOutputCoder(CoderRegistry, Coder<Double>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- getDefaultOutputCoder(CoderRegistry, Coder<Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- getDefaultOutputCoder(CoderRegistry, Coder<Long>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
- getDefaultOutputCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
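As a sketch of the getDefaultOutputCoder(CoderRegistry, Coder) overrides listed above: a CombineFn can pin its output coder explicitly instead of relying on inference. The SumLongs class here is hypothetical and exists only to illustrate the override:

```java
import org.apache.beam.sdk.coders.Coder;
import org.apache.beam.sdk.coders.CoderRegistry;
import org.apache.beam.sdk.coders.VarLongCoder;
import org.apache.beam.sdk.transforms.Combine;
import java.util.Collections;

public class CombineCoderExample {
    // Hypothetical CombineFn summing longs; overrides getDefaultOutputCoder
    // so the output coder never has to be inferred from the type.
    static class SumLongs extends Combine.CombineFn<Long, long[], Long> {
        @Override public long[] createAccumulator() { return new long[1]; }
        @Override public long[] addInput(long[] acc, Long in) { acc[0] += in; return acc; }
        @Override public long[] mergeAccumulators(Iterable<long[]> accs) {
            long[] out = new long[1];
            for (long[] a : accs) out[0] += a[0];
            return out;
        }
        @Override public Long extractOutput(long[] acc) { return acc[0]; }
        @Override public Coder<Long> getDefaultOutputCoder(CoderRegistry reg, Coder<Long> inputCoder) {
            return VarLongCoder.of(); // explicit output coder
        }
    }

    // Drives the accumulator contract by hand, outside any pipeline.
    static long sum(long... values) {
        SumLongs fn = new SumLongs();
        long[] acc = fn.createAccumulator();
        for (long v : values) acc = fn.addInput(acc, v);
        return fn.extractOutput(fn.mergeAccumulators(Collections.singletonList(acc)));
    }

    public static void main(String[] args) {
        System.out.println(sum(1, 2, 3));
    }
}
```
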
- getDefaultOverrides(boolean) - Static method in class org.apache.beam.runners.spark.SparkTransformOverrides
- getDefaultPrecision(SqlTypeName) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
- getDefaultSdkHarnessLogLevel() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
This option controls the default log level of all loggers without a log level override.
- getDefaultValue() - Method in interface org.apache.beam.sdk.values.PCollectionViews.HasDefaultValue
- getDefaultValue() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated.Returns the default value that was specified.
- getDefaultValue() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
-
Returns the default value that was specified.
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
Return a
WindowMappingFnthat returns the earliest window that contains the end of the main-input window. - getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Returns the default
WindowMappingFnto use to map main input windows to side input windows. - getDefaultWorkerLogLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated.This option controls the default log level of all loggers without a log level override.
- getDeidentifyConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getDeidentifyTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getDelay() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.Delay
- getDelimiter() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration
- getDelimiters() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
- getDeliveryMode() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getDeliveryMode() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getDependencies() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
- getDependencies(ConfigT, PipelineOptions) - Method in interface org.apache.beam.sdk.transforms.ExternalTransformBuilder
-
List the dependencies needed for this transform.
- getDependencies(RunnerApi.FunctionSpec, PipelineOptions) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
- getDescription() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
If the field has a description, returns the description for the field.
- getDescription() - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns the field's description.
- getDescription() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
-
The description of what was being attempted when the failure occurred.
- getDescriptor(String) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- getDescriptorFromTableSchema(TableSchema, boolean, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
-
Given a BigQuery TableSchema, returns a protocol-buffer Descriptor that can be used to write data using the BigQuery Storage API.
- getDeserializer(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- getDeserializer(Map<String, ?>, boolean) - Method in interface org.apache.beam.sdk.io.kafka.DeserializerProvider
- getDesiredNumUnboundedSourceSplits() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The desired number of initial splits for UnboundedSources.
- getDestination() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
-
Staged target for this file.
- getDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
- getDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Return the user destination object for this writer.
- getDestination() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the destination (topic or queue) to which the message was sent.
- getDestination(String, String) - Method in interface org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestinationProvider
- getDestination(ValueInSingleWindow<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- getDestination(ValueInSingleWindow<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns an object that represents at a high level which table is being written to.
- getDestination(ValueInSingleWindow<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- getDestination(UserT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Returns an object that represents at a high level the destination being written to.
- getDestinationCoder() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Returns the coder for
FileBasedSink.DynamicDestinations. - getDestinationCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns the coder for
DynamicDestinations. - getDestinationCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- getDestinationFile(boolean, FileBasedSink.DynamicDestinations<?, DestinationT, ?>, int, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
- getDestinationFn() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getDiagnostics() - Method in exception class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler.CompileException
- getDictionary(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getDictionary(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getDir() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
- getDirectoryTreatment() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
- getDisableAutoCommit() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getDisableMetrics() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getDiskSizeGb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Remote worker disk size, in gigabytes, or 0 to use the default size.
- getDistribution() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- getDistribution(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- getDistribution(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the
Distributionthat should be used for implementing the givenmetricNamein this container. - getDistributions() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the distributions that matched the filter.
- getDlqTransform(String) - Static method in class org.apache.beam.sdk.schemas.io.GenericDlq
- getDocToBulk() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- getDocumentCount() - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
- getDoFn() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- getDoFn() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.PartialReduceBundleOperator
- getDoFn() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.WindowDoFnOperator
- getDoFn() - Method in class org.apache.beam.runners.spark.translation.SparkProcessContext
- getDoFnRunner() - Method in class org.apache.beam.runners.spark.translation.SparkProcessContext
- getDoFnRunner(PipelineOptions, DoFn<InputT, OutputT>, SideInputReader, AbstractParDoP.JetOutputManager, TupleTag<OutputT>, List<TupleTag<?>>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, WindowingStrategy<?, ?>, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Method in class org.apache.beam.runners.jet.processors.ParDoP
- getDoFnRunner(PipelineOptions, DoFn<KV<?, ?>, OutputT>, SideInputReader, AbstractParDoP.JetOutputManager, TupleTag<OutputT>, List<TupleTag<?>>, Coder<KV<?, ?>>, Map<TupleTag<?>, Coder<?>>, WindowingStrategy<?, ?>, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- getDoFnSchemaInformation(DoFn<?, ?>, PCollection<?>) - Static method in class org.apache.beam.sdk.transforms.ParDo
-
Extract information on how the DoFn uses schemas.
- getDouble() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getDouble(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.DOUBLEvalue by field index,ClassCastExceptionis thrown if schema doesn't match. - getDouble(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.DOUBLEvalue by field name,IllegalStateExceptionis thrown if schema doesn't match. - getDriverClassName() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
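The Row typed getters listed above (getDateTime, getDecimal, getDouble, and friends) can be sketched as follows; the "price"/"name" schema is hypothetical and only illustrates the by-index versus by-name access pattern:

```java
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.values.Row;

public class RowGetterExample {
    // Builds a hypothetical two-field Row and reads a field both ways.
    static double readPrice() {
        Schema schema = Schema.builder()
            .addDoubleField("price")
            .addStringField("name")
            .build();
        Row row = Row.withSchema(schema).addValues(9.99, "widget").build();
        // Typed getters accept either a field index or a field name;
        // a mismatched type surfaces as an exception at access time.
        double byIndex = row.getDouble(0);
        double byName = row.getDouble("price");
        assert byIndex == byName;
        return byName;
    }

    public static void main(String[] args) {
        System.out.println(readPrice());
    }
}
```
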
- getDriverClassName() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getDriverJars() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getDriverJars() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getDrop() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getDrop() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getDrop() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
- getDropFields() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getDStream() - Method in class org.apache.beam.runners.spark.translation.streaming.UnboundedDataset
- getDumpHeapOnOOM() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
If true, save a heap dump before killing a thread or process which is GC thrashing or out of memory.
- getDuplicateCount() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getDynamicDestinations() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSink
- getDynamicDestinations() - Method in class org.apache.beam.sdk.io.FileBasedSink
-
Return the
FileBasedSink.DynamicDestinationsused. - getEarliestBufferedSequence() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getEarliestHL7v2SendTime(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Gets the earliest HL7v2 send time.
- getEarliestHL7v2SendTime(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- getEarlyTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
- getEffectiveInputWatermark() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- getElasticsearchHttpPort() - Method in interface org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon.ElasticsearchPipelineOptions
- getElasticsearchServer() - Method in interface org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon.ElasticsearchPipelineOptions
- getElemCoder() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
- getElement() - Method in class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
- getElementByteSize() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- getElementCoder() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- getElementCoders() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- getElementConverters() - Method in class org.apache.beam.sdk.transforms.DoFnSchemaInformation
-
The schema of the @Element parameter.
- getElementCount() - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
-
The number of elements after which this trigger may fire.
- getElementProcessingTimeoutMinutes() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
The time limit (in minutes) that an SDK worker allows for a PTransform operation before signaling the runner harness to restart the SDK worker.
- getElements() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.KeyedBufferingElementsHandler
- getElements() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.NonKeyedBufferingElementsHandler
- getElements() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate
- getElements() - Method in class org.apache.beam.sdk.testing.TestStream.ElementEvent
- getElements() - Method in class org.apache.beam.sdk.transforms.Create.Values
- getElementType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
If the field is a container type, returns the element type.
- getEmptyMatchTreatment() - Method in class org.apache.beam.sdk.io.FileBasedSource
- getEmptyMatchTreatment() - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
- getEmptyMatchTreatment() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
- getEmulatorHost() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
A host:port pair to allow connecting to a Cloud Firestore emulator instead of the live service.
- getEmulatorHost() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getEnableBucketReadMetricCounter() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
If true, reports number of bytes read from each gcs bucket.
- getEnableBucketWriteMetricCounter() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
If true, reports number of bytes written to each gcs bucket.
- getEnableHeapDumps() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
If true and PipelineOption tempLocation is set, save a heap dump before shutting down the JVM due to GC thrashing or out of memory.
- getEnableLogViaFnApi() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
This option controls whether logging will be redirected through the FnApi.
- getEnableSparkMetricSinks() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- getEnableStableInputDrain() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getEnableStorageReadApiV2() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getEnableWebUI() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- getEncodedElementByteSize(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
- getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
-
Overridden to short-circuit the default
StructuredCoderbehavior of encoding and counting the bytes. - getEncodedElementByteSize(HyperLogLogPlus) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
- getEncodedElementByteSize(TableRow) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- getEncodedElementByteSize(ByteString) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- getEncodedElementByteSize(Boolean) - Method in class org.apache.beam.sdk.coders.BooleanCoder
- getEncodedElementByteSize(Byte) - Method in class org.apache.beam.sdk.coders.ByteCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Double) - Method in class org.apache.beam.sdk.coders.DoubleCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Float) - Method in class org.apache.beam.sdk.coders.FloatCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
- getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.VarIntCoder
- getEncodedElementByteSize(Long) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Long) - Method in class org.apache.beam.sdk.coders.VarLongCoder
- getEncodedElementByteSize(Short) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(String) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Void) - Method in class org.apache.beam.sdk.coders.VoidCoder
- getEncodedElementByteSize(BigDecimal) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(BigInteger) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Optional<T>) - Method in class org.apache.beam.sdk.coders.OptionalCoder
-
Overridden to short-circuit the default
StructuredCoderbehavior of encoding and counting the bytes. - getEncodedElementByteSize(IsmFormat.Footer) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.FooterCoder
- getEncodedElementByteSize(IsmFormat.KeyPrefix) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefixCoder
- getEncodedElementByteSize(RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- getEncodedElementByteSize(EncodedBoundedWindow) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
- getEncodedElementByteSize(BigQueryInsertError) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- getEncodedElementByteSize(OffsetRange) - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
- getEncodedElementByteSize(ByteString) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- getEncodedElementByteSize(Instant) - Method in class org.apache.beam.sdk.coders.InstantCoder
- getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.Coder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
Overridden to short-circuit the default
StructuredCoderbehavior of encoding and counting the bytes. - getEncodedElementByteSizeUsingCoder(Coder<T>, T) - Static method in class org.apache.beam.sdk.coders.Coder
- getEncodedRecord() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record
-
Nullable to account for failing to encode, or if there is no coder for the record at the time of failure.
- getEncodedTypeDescriptor() - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ByteCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.Coder
-
Returns the TypeDescriptor for the type encoded.
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.CollectionCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DelegateCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DequeCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DoubleCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DurationCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.FloatCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.InstantCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.IterableCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.KvCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ListCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.MapCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.NullableCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.OptionalCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SerializableCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SetCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VarIntCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VarLongCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VoidCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
- getEncodedWindow() - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
- getEncodingPositions() - Method in class org.apache.beam.sdk.schemas.Schema
-
Gets the encoding positions for this schema.
- getEnd() - Method in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
- getEnd() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
- getEnd() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
- getEnd() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
- getEndAtTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- getEndKey() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns the ByteKey representing the upper bound of this ByteKeyRange.
- getEndOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns the specified ending offset of the source.
- getEndOffset() - Method in interface org.apache.beam.sdk.io.sparkreceiver.HasOffset
- getEndpoint() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
Endpoint used to configure AWS service clients.
- getEndTime() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- getEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEndRecord
-
The end timestamp at which the change stream partition is terminated.
- getEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The end time for querying this given partition.
- getEnumeratorCheckpointSerializer() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- getEnumName(int) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- getEnumValue(String) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- getEnvironment() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
- getEnvironment() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
-
Return the environment that the remote handles.
- getEnvironment() - Method in class org.apache.beam.runners.fnexecution.environment.RemoteEnvironment.SimpleRemoteEnvironment
- getEnvironment() - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironment
- getEnvironment() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getEnvironmentCacheMillis() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getEnvironmentExpirationMillis() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getEnvironmentId() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getEnvironmentOption(PortablePipelineOptions, String) - Static method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
Return the value for the specified environment option or empty string if not present.
- getEnvironmentOptions() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getEquivalentFieldType(TableSchema.ColumnType) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema
-
Returns Beam equivalent of ClickHouse column type.
- getEquivalentSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema
-
Returns Beam equivalent of ClickHouse schema.
- getError() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- getError() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
-
The error details if the message could not be published.
- getError() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Returns the parse error, if the file was parsed unsuccessfully.
- getError() - Method in class org.apache.beam.sdk.schemas.io.Failure
-
Information about the cause of the failure.
- getErrorAsString() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Same as ParseResult.getError(), but returns the complete stack trace of the error as a String.
- getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
- getErrorHandling() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
- getErrorInfo(IOException) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
- getErrorMessage() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
- getErrorMessage() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getErrorRowSchema(Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- getErrors() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
-
The CsvIOParseError PCollection as a result of errors associated with parsing CSV records.
- getEstimatedLowWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.impulse.BeamImpulseSource
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.BoundedCosmosBDSource
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
-
An estimate of the total size (in bytes) of the data that would be read from this source.
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
- getEvaluator() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getEvent() - Method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
- getEventCoder(Pipeline, Coder<KV<KeyT, KV<Long, EventT>>>) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Provide the event coder.
- getEventExaminer() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
- getEvents() - Method in class org.apache.beam.sdk.testing.TestStream
-
Returns the sequence of Events in this TestStream.
- getEx() - Method in class org.apache.beam.runners.jet.processors.ParDoP.Supplier
- getEx() - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP.Supplier
- getEx() - Method in class org.apache.beam.runners.jet.processors.FlattenP.Supplier
- getException() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
-
The exception itself, e.g.
- getExceptionStacktrace() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
-
The full stacktrace.
- getExecutables() - Method in interface org.apache.beam.runners.direct.ExecutableGraph
- getExecutableStageIntermediateId(PipelineNode.PTransformNode) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
- getExecuteStreamingSqlRetrySettings() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getExecutionEnvironment() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
- getExecutionEnvironment() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
- getExecutionModeForBatch() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getExecutionRetryDelay() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getExecutorService() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
Deprecated. Use ExecutorOptions.getScheduledExecutorService() instead.
- getExpansionPort() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- getExpansionServiceConfig() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- getExpansionServiceConfigFile() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- getExpectedAssertions() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
- getExpectFileToNotExist() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
-
If true, the created file is expected to not exist.
- getExperimentalHost() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getExperiments() - Method in interface org.apache.beam.sdk.options.ExperimentalOptions
- getExperimentValue(PipelineOptions, String) - Static method in interface org.apache.beam.sdk.options.ExperimentalOptions
-
Return the value for the specified experiment or null if not present.
- getExpiration() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getExpiration() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the message expiration time in milliseconds since the Unix epoch.
- getExplanation() - Method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
- getExplicitHashKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner.ExplicitPartitioner
-
Required hash value (128-bit integer) to determine explicitly the shard a record is assigned to based on the hash key range of each shard.
- getExplicitHashKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
-
Optional hash value (128-bit integer) to determine explicitly the shard a record is assigned to based on the hash key range of each shard.
- getExpression() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
- getExpression(SchemaPlus, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogManagerSchema
- getExpression(SchemaPlus, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogSchema
- getExpression(SchemaPlus, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getExtendedSequenceNumber() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getExtensionHosts() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- getExtensionRegistry() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
Returns the ExtensionRegistry listing all known Protocol Buffers extension messages to T registered with this ProtoCoder.
- getExternalSorterType() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
-
Returns the external sorter type.
- getExtraInteger() - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config
- getExtraString() - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config
- getFactory() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertValueForGetter
- getFactory() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertValueForSetter
- getFactory() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- getFactory() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- getFactory(AwsOptions) - Static method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
Get a ClientBuilderFactory instance according to AwsOptions.getClientBuilderFactory().
- getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
-
Gets failed bodies with errors.
- getFailedBundles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
Gets failed FhirBundleResponse wrapped inside HealthcareIOError.
- getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
-
Gets failed file imports with errors.
- getFailedInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Returns a PCollection containing the TableRows that didn't make it to BQ.
- getFailedInsertsWithErr() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Returns a PCollection containing the BigQueryInsertErrors with detailed error information.
- getFailedInsertsWithErr() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
- getFailedLatencyMetric() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getFailedMessages() - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
- getFailedMutations() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
-
Gets failed reads.
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
-
Gets failed reads.
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
-
Gets failed reads.
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.Result
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- getFailedRowsTag() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- getFailedRowsTupleTag() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
- getFailedSearches() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
-
Gets failed searches.
- getFailedStorageApiInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Return any rows that persistently fail to insert when using a storage-api method.
- getFailedToParseLines() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
-
Returns a PCollection containing the Rows that didn't parse.
- getFailOnCheckpointingErrors() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getFailsafeTableRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- getFailsafeTableRowPayload() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- getFailsafeValue() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the failsafe value of this FailsafeValueInSingleWindow.
- getFailure() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
-
Information about why the record failed.
- getFailureCollector() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getFailures() - Method in class org.apache.beam.io.requestresponse.Result
- getFanout() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
- getFasterCopy() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getFetchSize() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getFhirBundleParameter() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
-
FhirBundleParameter represents a FHIR bundle in JSON format to be executed on a FHIR store.
- getFhirStore() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
- getField() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
- getField() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.KeyPart
- getField() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- getField(int) - Method in class org.apache.beam.sdk.schemas.Schema
-
Return a field by index.
- getField(String) - Method in class org.apache.beam.sdk.schemas.Schema
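The two getField overloads above can be illustrated with a short sketch (the class name SchemaFieldDemo is assumed for illustration):

```java
import org.apache.beam.sdk.schemas.Schema;

// Illustrative only: SchemaFieldDemo is not a Beam class.
public class SchemaFieldDemo {
  public static void main(String[] args) {
    Schema schema =
        Schema.builder().addStringField("name").addInt32Field("age").build();
    // Fields can be fetched by index or by name.
    System.out.println(schema.getField(0).getName()); // name
    System.out.println(schema.getField("age").getType());
    System.out.println(schema.getFieldCount()); // 2
  }
}
```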
- getFieldAccessDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFnSchemaInformation
-
Effective FieldAccessDescriptor applied by DoFn.
- getFieldCount() - Method in class org.apache.beam.sdk.schemas.Schema
-
Return the count of fields.
- getFieldCount() - Method in class org.apache.beam.sdk.values.Row
-
Return the size of data fields.
- getFieldCount() - Method in class org.apache.beam.sdk.values.RowWithGetters
- getFieldCount() - Method in class org.apache.beam.sdk.values.RowWithStorage
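Row.getFieldCount and the typed getters listed further down this index (such as getFloat) work against the Row's schema. A minimal sketch (the class name RowFieldDemo is ours, for illustration):

```java
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.values.Row;

// Illustrative only: RowFieldDemo is not a Beam class.
public class RowFieldDemo {
  public static void main(String[] args) {
    Schema schema = Schema.builder().addFloatField("ratio").build();
    // Rows are built against a schema; addValues is positional.
    Row row = Row.withSchema(schema).addValues(0.5f).build();
    System.out.println(row.getFieldCount());   // 1
    System.out.println(row.getFloat("ratio")); // 0.5
  }
}
```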
- getFieldDescription(T) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- getFieldId() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
- getFieldName() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
- getFieldNames() - Method in class org.apache.beam.sdk.schemas.Schema
-
Return the list of all field names.
- getFieldOptionById(int) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- getFieldRef(CEPOperation) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
A function that finds a pattern reference recursively.
- getFieldRename() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
- getFields() - Method in class org.apache.beam.sdk.schemas.Schema
- getFields() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration
- getFields() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
- getFields(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- getFieldsAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
- getFieldType(OneOfType.Value) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
- getFieldType(Schema, CEPOperation) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
- getFieldTypes(TypeDescriptor<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- getFieldTypes(TypeDescriptor<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- getFieldTypes(TypeDescriptor<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Get field types for an AVRO-generated SpecificRecord or a POJO.
- getFileDescriptor(String) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- getFileDescriptorPath() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getFileDescriptorPath() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getFileDescriptorPath() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getFileDescriptorPath() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getFileFormat() - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination
-
File format for created files.
- getFileInputSplitMaxSizeMB() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getFileLocation() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Returns the absolute path to the input file.
- getFilename() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
- getFilename() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
- getFilename() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
-
The filename associated with the caught Exception.
- getFilename() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
-
Returns the name of the file or directory denoted by this ResourceId.
- getFilename(BoundedWindow, PaneInfo, int, int, Compression) - Method in interface org.apache.beam.sdk.io.FileIO.Write.FileNaming
-
Generates the filename.
- getFileName() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- getFilenamePolicy(DestinationT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Converts a destination into a FileBasedSink.FilenamePolicy.
- getFilenamePrefix() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- getFilenameSuffix() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- getFilenameSuffix() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getFileOrPatternSpec() - Method in class org.apache.beam.sdk.io.FileBasedSource
- getFileOrPatternSpecProvider() - Method in class org.apache.beam.sdk.io.FileBasedSource
- getFilepattern() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
-
The filepattern used to match and read files.
- getFilePattern() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
- getFilePattern() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
- getFilePattern() - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration
- getFilesList() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the list of names of staged files.
- getFilesList() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceConfig
-
Getter for the list of staged files that will be loaded to Snowflake.
- getFilesToStage() - Method in interface org.apache.beam.sdk.options.FileStagingOptions
-
List of local files to make available to workers.
- getFileSystem() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- getFilter() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getFilterFormatFunction(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- getFilterString() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getFinishBundleBeforeCheckpointing() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getFinishedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which the connector finished processing this partition.
- getFirestoreDb() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
The Firestore database ID to connect to.
- getFirestoreHost() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
A host port pair to allow connecting to a Cloud Firestore instead of the default live service.
- getFirestoreProject() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
The Firestore project ID to connect to.
- getFirstTimestamp() - Method in class org.apache.beam.runners.spark.translation.SparkStreamingTranslationContext
- getFlatComparators() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- getFlatJsonRows(Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
- getFlatten() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
- getFlexRSGoal() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
This option controls Flexible Resource Scheduling mode.
- getFlinkConfDir() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getFlinkMaster() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
The url of the Flink JobManager on which to execute pipelines.
- getFloat() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getFloat(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.FLOAT value by field index; a ClassCastException is thrown if the schema doesn't match.
- getFloat(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.FLOAT value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getFn() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
- getFn() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
- getFn() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
- getFn() - Method in class org.apache.beam.runners.spark.translation.DoFnRunnerWithMetrics
- getFn() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns the CombineFnBase.GlobalCombineFn used by this Combine operation.
- getFn() - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
-
Returns the CombineFnBase.GlobalCombineFn used by this Combine operation.
- getFn() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
Returns the CombineFnBase.GlobalCombineFn used by this Combine operation.
- getFn() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
- getFn() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
- getFnApiDevContainerVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the version/tag for dev SDK FnAPI container image.
- getFnApiEnvironmentMajorVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the FnAPI environment's major version number.
- getForceSlotSharingGroup() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getForceUnalignedCheckpointEnabled() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getFormat() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
-
The format of the file(s) to read.
- getFormat() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getFormat() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getFormat() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getFormatClass() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets InputFormat or OutputFormat class for a plugin.
- getFormatClass() - Method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Format
- getFormatClass() - Method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
- getFormatName() - Method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Format
- getFormatProviderClass() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets InputFormatProvider or OutputFormatProvider class for a plugin.
- getFormatProviderClass() - Method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.FormatProvider
- getFormatProviderName() - Method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.FormatProvider
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
Returns a value in [0, 1] representing approximately what fraction of the current source this reader has read so far, or null if such an estimate is not available.
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
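For an offset-based tracker, the fraction consumed can be derived from the position of the last successfully returned record within the tracked range. A hypothetical sketch of that computation (Beam's actual OffsetRangeTracker carries extra bookkeeping for splits and completion that is omitted here):

```java
// Hypothetical sketch of how an offset-based range tracker could report
// progress as a fraction in [0, 1]; not Beam's actual implementation.
public class OffsetProgress {
    private final long startOffset;
    private final long stopOffset;
    private long lastRecordStart = -1; // no record returned yet

    public OffsetProgress(long startOffset, long stopOffset) {
        this.startOffset = startOffset;
        this.stopOffset = stopOffset;
    }

    // Record a successfully returned record's starting offset.
    public void tryReturnRecordAt(long recordStart) {
        lastRecordStart = recordStart;
    }

    // 0.0 before any successful tryReturnRecordAt call, matching the
    // RangeTracker.getFractionConsumed contract described above.
    public double getFractionConsumed() {
        if (lastRecordStart < 0) {
            return 0.0;
        }
        return (double) (lastRecordStart - startOffset) / (stopOffset - startOffset);
    }
}
```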
- getFractionConsumed() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
-
Returns the approximate fraction of positions in the source that have been consumed by successful RangeTracker.tryReturnRecordAt(boolean, PositionT) calls, or 0.0 if no such calls have happened.
- getFractionOfBlockConsumed() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
-
Returns the fraction of the block already consumed, if possible, as a value in [0, 1].
- getFrom() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
-
Returns the range start timestamp (inclusive).
- getFrom() - Method in class org.apache.beam.sdk.io.range.OffsetRange
- getFrom(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
-
Returns the estimated throughput bytes for a specified time.
- getFrom(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.NullThroughputEstimator
-
Always returns 0.
- getFrom(Timestamp) - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.ThroughputEstimator
-
Returns the estimated throughput for a specified time.
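The ThroughputEstimator.getFrom entries above return an estimated throughput for a given time. A hypothetical sketch of the underlying idea, summing recorded byte counts over a trailing window (Beam's BytesThroughputEstimator uses its own windowing and decay logic; all names here are illustrative):

```java
import java.util.TreeMap;

// Hypothetical windowed bytes-per-second estimator; not Beam's implementation.
public class MiniThroughputEstimator {
    private final TreeMap<Long, Long> bytesAtSecond = new TreeMap<>();
    private final long windowSeconds;

    public MiniThroughputEstimator(long windowSeconds) {
        this.windowSeconds = windowSeconds;
    }

    // Record bytes processed at a given epoch second.
    public void update(long epochSecond, long bytes) {
        bytesAtSecond.merge(epochSecond, bytes, Long::sum);
    }

    // Estimated bytes/second over the window ending at the given time.
    public double getFrom(long epochSecond) {
        long from = epochSecond - windowSeconds;
        long total = bytesAtSecond.subMap(from, true, epochSecond, true)
                .values().stream().mapToLong(Long::longValue).sum();
        return (double) total / windowSeconds;
    }
}
```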
- getFromRowFunction() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
-
Deprecated.
- getFromRowFunction() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
-
Returns the fromRow conversion function.
- getFromRowFunction() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the attached schema's fromRowFunction.
- getFromRowFunction(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
- getFromRowFunction(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve the function that converts a Row object to the specified type.
- getFromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve the function that converts a Row object to the specified type.
- getFromSnapshotExclusive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
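The SchemaRegistry.getFromRowFunction entries above retrieve a per-type converter. A hypothetical sketch of that lookup shape, keyed by class (Beam's registry resolves schema providers and TypeDescriptors rather than a plain Map; all names here are illustrative):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical type-keyed converter registry; not Beam's SchemaRegistry.
public class MiniConverterRegistry {
    private final Map<Class<?>, Function<Map<String, Object>, ?>> fromRowFns = new HashMap<>();

    public <T> void register(Class<T> clazz, Function<Map<String, Object>, T> fromRow) {
        fromRowFns.put(clazz, fromRow);
    }

    // Fails loudly when no converter is registered for the requested type.
    @SuppressWarnings("unchecked")
    public <T> Function<Map<String, Object>, T> getFromRowFunction(Class<T> clazz) {
        Function<Map<String, Object>, ?> fn = fromRowFns.get(clazz);
        if (fn == null) {
            throw new IllegalStateException("No schema registered for " + clazz);
        }
        return (Function<Map<String, Object>, T>) fn;
    }
}
```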
- getFromSnapshotInclusive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getFromSnapshotRefExclusive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getFromSnapshotRefInclusive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getFromTimestamp() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getFullCoder(Coder<T>, Coder<? extends BoundedWindow>) - Static method in class org.apache.beam.sdk.values.WindowedValues
-
Returns the Coder to use for a WindowedValue<T>, using the given valueCoder and windowCoder.
- getFullName(PTransform<?, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Returns the full name of the currently being translated transform.
- getFullPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- getFullPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- getFunction() - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
- getFunction() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf
- getFunctionNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getFunctionNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogManagerSchema
- getFunctionNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogSchema
- getFunctions(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getFunctions(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogManagerSchema
- getFunctions(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogSchema
- getGapDuration() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- getGauge(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- getGauge(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the Gauge that should be used for implementing the given metricName in this container.
- getGauges() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the gauges that matched the filter.
- getGauges(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
- getGauges(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
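The MetricsContainer.getGauge entry above returns the cell to use for a given metric name, so repeated lookups of the same name must yield the same cell. A hypothetical create-if-absent sketch of that contract (not Beam's MetricsContainer API; the AtomicLong stand-in for a gauge cell is illustrative):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical per-name metric-cell lookup; not Beam's MetricsContainer.
public class MiniMetricsContainer {
    private final Map<String, AtomicLong> gauges = new ConcurrentHashMap<>();

    // The same name always yields the same cell; the cell is created on first use.
    public AtomicLong getGauge(String metricName) {
        return gauges.computeIfAbsent(metricName, name -> new AtomicLong());
    }
}
```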
- getGbek() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
A string defining whether GroupByKey transforms should be replaced by GroupByEncryptedKey.
- getGcloudCancelCommand(DataflowPipelineOptions, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
- getGcpCredential() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
The credential instance that should be used to authenticate against GCP services.
- getGcpOauthScopes() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
Controls the OAuth scopes that will be requested when creating Credentials with the GcpCredentialFactory (which is the default CredentialFactory).
- getGcpTempLocation() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
A GCS path for storing temporary files in GCP.
- getGcsCustomAuditEntries() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- getGcsEndpoint() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
GCS endpoint to use.
- getGcsHttpRequestReadTimeout() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- getGcsHttpRequestWriteTimeout() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- getGcsPerformanceMetrics() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
If true, reports metrics of certain operations, such as batch copies.
- getGcsReadCounterPrefix() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- getGcsRewriteDataOpBatchLimit() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- getGcsUploadBufferSizeBytes() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
The buffer size (in bytes) to use when uploading files to GCS.
- getGcsUtil() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
The GcsUtil instance that should be used to communicate with Google Cloud Storage.
- getGcsWriteCounterPrefix() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- getGCThrashingPercentagePerPeriod() - Method in interface org.apache.beam.sdk.options.MemoryMonitorOptions
-
The GC thrashing threshold percentage.
- getGenericRecordToRowFunction(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Returns a function mapping AVRO GenericRecords to Beam Rows for use in PCollection.setSchema(org.apache.beam.sdk.schemas.Schema, org.apache.beam.sdk.values.TypeDescriptor<T>, org.apache.beam.sdk.transforms.SerializableFunction<T, org.apache.beam.sdk.values.Row>, org.apache.beam.sdk.transforms.SerializableFunction<org.apache.beam.sdk.values.Row, T>).
- getGetOffsetFn() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets a SerializableFunction that defines how to get the record offset for a CDAP Plugin class.
- getGetReceiverArgsFromConfigFn() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets a SerializableFunction that defines how to get constructor arguments for Receiver using PluginConfig.
- getGetters() - Method in class org.apache.beam.sdk.values.RowWithGetters
- getGetters(TypeDescriptor<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Get generated getters for an AVRO-generated SpecificRecord or a POJO.
- getGetters(TypeDescriptor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
Return the list of FieldValueGetters for a Java Bean class.
- getGetters(TypeDescriptor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- getGetterTarget() - Method in class org.apache.beam.sdk.values.RowWithGetters
- getGlobalConfigRefreshPeriod() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getGlobalSequenceCombiner() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler.OrderedProcessingGlobalSequenceHandler
-
Provide the global sequence combiner.
- getGoogleAdsClientId() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
-
OAuth 2.0 Client ID identifying the application.
- getGoogleAdsClientSecret() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
-
OAuth 2.0 Client Secret for the specified Client ID.
- getGoogleAdsCredential() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
-
The credential instance that should be used to authenticate against the Google Ads API.
- getGoogleAdsCredentialFactoryClass() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
-
The class of the credential factory to create credentials if none have been explicitly set.
- getGoogleAdsDeveloperToken() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
-
Google Ads developer token for the user connecting to the Google Ads API.
- getGoogleAdsEndpoint() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
-
Host endpoint to use for connections to the Google Ads API.
- getGoogleAdsRefreshToken() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
-
OAuth 2.0 Refresh Token for the user connecting to the Google Ads API.
- getGoogleApiTrace() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions
-
This option enables tracing of API calls to Google services used within the Apache Beam SDK.
- getGoogleCloudStorageReadOptions() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- getGrammarFileName() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- getGrammarFileName() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- getGroupFilesFileLoad() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
Choose to use a GBK when gathering a list of files in batch FILE_LOAD.
- getGroupingTableMaxSizeMb() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Size (in MB) of each grouping table used to pre-combine elements.
- getGson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.DataChangeRecordToRow
- getGzipCompressHeapDumps() - Method in interface org.apache.beam.sdk.options.MemoryMonitorOptions
-
Controls if heap dumps that are copied to remote destination are gzipped compressed.
- getHadoopConfiguration() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets a plugin Hadoop configuration.
- getHasError() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
- getHashCode() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- getHdfsConfiguration() - Method in interface org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions
- getHeaderAccessor() - Static method in class org.apache.beam.sdk.fn.server.GrpcContextHeaderAccessorProvider
- getHeaderColumns() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getHeaderColumns() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- getHeaderColumns() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getHeaders() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getHeaders() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getHeartbeatMillis() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The number of milliseconds the stream may be idle before a heartbeat record is emitted in the change stream query.
- getHighWatermark() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
- getHintMaxNumWorkers() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
A hint to the QoS system for the intended max number of workers for a pipeline.
- getHistogram(MetricName, HistogramData.BucketType) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the Histogram that should be used for implementing the given metricName in this container.
- getHistograms() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the histograms that matched the filter.
- getHistograms(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
- getHistograms(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
- getHL7v2Message() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
-
Gets hl7v2Message.
- getHL7v2Message(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Gets a Hl7v2 message by its name from a Hl7v2 store.
- getHL7v2Message(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
Gets HL7v2 message.
- getHl7v2MessageId() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
-
HL7v2MessageId string.
- getHL7v2Store(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Gets an HL7v2 store.
- getHL7v2Store(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
Gets HL7v2 store.
- getHoldability() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getHost() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- getHost() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- getHost() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServer
-
Get the host that this ExpansionServer is bound to.
- getHost() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getHostValue() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getHttpClient() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getHttpClientConfiguration() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
HttpClientConfiguration used to configure AWS service clients.
- getHttpPipeline() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getHTTPReadTimeout() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getHTTPWriteTimeout() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getHumanReadableJsonRecord() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record
-
The failing record, encoded as JSON.
- getIcebergCatalog() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getId() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShard
-
Return the shard id.
- getId() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
-
Get an id used to represent this bundle.
- getId() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
-
Returns an id used to represent this bundle.
- getId() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
- getId() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
- getId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
- getId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
- getId() - Method in interface org.apache.beam.sdk.fn.IdGenerator
- getId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
- getId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SchemaPath
- getId() - Method in class org.apache.beam.sdk.values.TupleTag
-
Returns the id of this TupleTag.
- getId() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the id attribute.
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the id attribute.
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.LocalMktDate
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCDateOnly
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
- getIdentifier() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.IcebergTableInfo
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
- getIdentifier() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
-
The unique identifier for this type.
- getIdleShutdownTimeout() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- getImpersonateServiceAccount() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
All API requests will be made as the given service account or target service account in an impersonation delegation chain instead of the currently selected account.
- getImplementor() - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
- getImplementor(boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
- getInboundObserver() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- getIncompatibleGlobalWindowErrorMessage() - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
-
Returns the error message for unsupported default values in Combine.globally().
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
- getIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPFieldRef
- getIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
- getIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexInputRef
- getIndex() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getIndex() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getIndex() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getIndex() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
The zero-based index of this trigger firing that produced this pane.
- getIndex(TupleTag<?>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
Returns the index for the given tuple tag if the tag is present in this schema, or -1 if it isn't.
- getIndexes() - Method in class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexFieldAccess
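The CoGbkResultSchema.getIndex entry above maps a tuple tag to its position, with -1 signalling an absent tag. A hypothetical sketch of that lookup over string tag ids (Beam's schema works with TupleTag objects; TagIndex is illustrative):

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical tag-to-index lookup; not Beam's CoGbkResultSchema.
public class TagIndex {
    private final List<String> tagIds;

    public TagIndex(List<String> tagIds) {
        this.tagIds = tagIds;
    }

    // Present tags map to their position; absent tags map to -1,
    // which List.indexOf already provides.
    public int getIndex(String tagId) {
        return tagIds.indexOf(tagId);
    }

    public static void main(String[] args) {
        TagIndex schema = new TagIndex(Arrays.asList("users", "orders"));
        System.out.println(schema.getIndex("orders"));
        System.out.println(schema.getIndex("missing"));
    }
}
```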
- getIndexOffset() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShard
-
Return the absolute position within the Ism file where the index block begins.
- getIndexPosition() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- getInferMaps() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
-
Controls whether to use the map or row FieldType for a TableSchema field that appears to represent a map (it is an array of structs containing only key and value fields).
- getInflightWaitSeconds() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
-
If the previous call to appendRows blocked due to flow control, returns how long the call blocked for.
- getIngestManager() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceConfig
-
Getter for the ingest manager, which serves an API to load data in streaming mode and to retrieve a report about the loaded data.
- getInitialBackoff() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The initial backoff duration to be used before retrying a request for the first time.
- getInitializedProducer(SolaceIO.SubmissionMode) - Method in class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- getInitializedProducer(SolaceIO.SubmissionMode) - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
Returns a MessageProducer object for publishing messages to Solace.
- getInitialRestriction(InputT) - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
- getInitialRestriction(Map<String, String>) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- getInitialRestriction(PulsarSourceDescriptor) - Method in class org.apache.beam.sdk.io.pulsar.NaiveReadFromPulsarDoFn
- getInitialWatermarkEstimatorState(InitialPipelineState) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
- getInitialWatermarkEstimatorState(PartitionRecord) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
- getInitialWatermarkEstimatorState(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
- getInitialWatermarkEstimatorState(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
- getInitialWatermarkEstimatorState(Instant) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- getInitialWatermarkEstimatorState(Instant) - Method in class org.apache.beam.sdk.io.pulsar.NaiveReadFromPulsarDoFn
- getInitialWatermarkEstimatorState(Instant) - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
- getInput() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.UnresolvedTranslation
- getInput() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getInput(PTransform<InputT, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
- getInput(PTransform<T, ?>) - Method in class org.apache.beam.runners.flink.translation.utils.LookupPipelineVisitor
- getInput(PTransform<T, ?>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getInput(PTransform<T, ?>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getInput(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- getInputDataSet(PValue) - Method in class org.apache.beam.runners.twister2.Twister2BatchTranslationContext
- getInputDataSet(PValue) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getInputDoc() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
- getInputFile() - Method in interface org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.WordCountOptions
- getinputFormatClass() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getinputFormatKeyClass() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getInputFormatProvider() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getinputFormatValueClass() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getInputId(PipelineNode.PTransformNode) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
- getInputReceivers() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
-
Get a map of PCollection ids to receivers which consume input elements, forwarding them to the remote environment.
- getInputReceivers() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
-
Get a map of PCollection ids to receivers which consume input elements, forwarding them to the remote environment.
- getInputs() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getInputs() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getInputs(PTransform<?, ?>) - Method in class org.apache.beam.runners.flink.translation.utils.LookupPipelineVisitor
- getInputs(PTransform<?, ?>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getInputs(PTransform<InputT, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Returns the input of the transform currently being translated.
- getInputSchema() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getInputSchemas() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getInputSplitAssigner(SourceInputSplit[]) - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat
- getInputSplitAssigner(GenericInputSplit[]) - Method in class org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
-
Returns the TypeVariable of InputT.
- getInputType() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
- getInputType() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getInputType() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns a TypeDescriptor capturing what is known statically about the input type of this CombineFn instance's most-derived class.
- getInputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFn
-
Returns a TypeDescriptor capturing what is known statically about the input type of this DoFn instance's most-derived class.
- getInputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.InferableFunction
-
Returns a TypeDescriptor capturing what is known statically about the input type of this InferableFunction instance's most-derived class.
- getInputValueCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
-
Returns the Coder of the values of the input to this transform.
- getInputValueCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns the Coder of the values of the input to this transform.
- getInsertBundleParallelism() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
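The getInputTypeDescriptor entries above return a TypeDescriptor for what is known statically about a function's input type. A minimal sketch, assuming the Beam Java SDK (sdks/java/core) is on the classpath; the class name InputTypeDemo is illustrative:

```java
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.values.TypeDescriptor;

public class InputTypeDemo {
  public static void main(String[] args) {
    // An anonymous DoFn subclass fixes InputT = String statically.
    DoFn<String, Integer> fn = new DoFn<String, Integer>() {
      @ProcessElement
      public void process(@Element String in, OutputReceiver<Integer> out) {
        out.output(in.length());
      }
    };
    // The descriptor is recovered by reflection on the most-derived class.
    TypeDescriptor<String> inType = fn.getInputTypeDescriptor();
    System.out.println(inType.getRawType()); // class java.lang.String
  }
}
```

The same pattern applies to InferableFunction and to CombineFn.getInputType.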
- getInsertCount() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getInsertDefault() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
- getInsertErrors() - Method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy.Context
- getInspectConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getInspectConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- getInspectConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getInspectTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getInspectTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- getInspectTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getInstance() - Static method in class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageContextFactory
- getInstance() - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
- getInstance() - Static method in class org.apache.beam.runners.spark.translation.SparkExecutableStageContextFactory
- getInstance() - Static method in class org.apache.beam.sdk.io.googleads.DefaultGoogleAdsClientFactory
- getInstance() - Static method in class org.apache.beam.sdk.metrics.NoOpCounter
- getInstance() - Static method in class org.apache.beam.sdk.metrics.NoOpHistogram
- getInstance(String, String) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler
- getInstance(SparkSession) - Static method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
-
Get the MetricsAccumulator on this driver.
- getInstanceAdminClient() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
- getInstanceConfigId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
-
Returns the instance id being written to.
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- getInstructionId() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
- getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
- getInstructionRequestHandler() - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
- getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
- getInstructionRequestHandler() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
-
Return an InstructionRequestHandler which can communicate with the environment.
- getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.environment.RemoteEnvironment.SimpleRemoteEnvironment
- getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironment
- getInt(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getInt(Map<String, Object>, String, Integer) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getInt16() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getInt16(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT16 value by field index; a ClassCastException is thrown if the schema doesn't match.
- getInt16(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT16 value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getInt32() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getInt32(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT32 value by field index; a ClassCastException is thrown if the schema doesn't match.
- getInt32(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT32 value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getInt64() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getInt64(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT64 value by field index; a ClassCastException is thrown if the schema doesn't match.
- getInt64(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT64 value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getInterface() - Method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
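The Row.getInt16/getInt32/getInt64 accessors above read schema-typed fields by index or by name. A minimal sketch, assuming the Beam Java SDK (sdks/java/core) is on the classpath; the class name RowGetterDemo is illustrative:

```java
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.values.Row;

public class RowGetterDemo {
  public static void main(String[] args) {
    // Declare a schema with an INT32 field and an INT64 field.
    Schema schema = Schema.builder()
        .addInt32Field("user_id")
        .addInt64Field("clicks")
        .build();

    Row row = Row.withSchema(schema).addValues(42, 7L).build();

    // Access by field name or by field index; a getter that doesn't
    // match the declared field type throws at runtime.
    Integer userId = row.getInt32("user_id"); // 42
    Long clicks = row.getInt64(1);            // 7L
    System.out.println(userId + " " + clicks);
  }
}
```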
- getInterfaces() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a set of TypeDescriptors, one for each interface implemented by this class.
- getIntersectingPartition(Range.ByteStringRange, Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Return the overlapping parts of 2 partitions.
- getIo() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- getIr() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
- getIrOptions() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
- getIsLocalChannelProvider() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getIsWindmillServiceDirectPathEnabled() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getIterable(int) - Method in class org.apache.beam.sdk.values.Row
-
Get an iterable value by field index; an IllegalStateException is thrown if the schema doesn't match.
- getIterable(String) - Method in class org.apache.beam.sdk.values.Row
-
Get an iterable value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getIterableComponentType(TypeDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
-
For an array T[] or a subclass of Iterable<T>, return a TypeDescriptor describing T.
- getJarPath() - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
Optional Beam filesystem path to the jar containing the bytecode for this function.
- getJavaClass(RelDataType) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamJavaTypeFactory
- getJavaClassLookupAllowlist() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- getJavaClassLookupAllowlistFile() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- getJAXBClass() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- getJdbcType() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getJdbcType() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getJdbcUrl() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getJdbcUrl() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getJdkAddOpenModules() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Open modules needed for reflection that access JDK internals with Java 9+.
- getJdkAddOpenModules() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Open modules needed for reflection that access JDK internals with Java 9+.
- getJdkAddRootModules() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Add modules to the default root set with Java 11+.
- getJetDefaultParallelism() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- getJetLocalMode() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- getJetProcessorsCooperative() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- getJetServers() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- getJfrRecordingDurationSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- getJmsCorrelationID() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsDeliveryMode() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsDestination() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsExpiration() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsMessageID() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsPriority() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsRedelivered() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsReplyTo() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsTimestamp() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsType() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJob() - Method in exception class org.apache.beam.runners.dataflow.DataflowJobException
-
Returns the failed job.
- getJob() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
- getJob(JobReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Gets the specified Job by the given JobReference.
- getJob(JobReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- getJob(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
Gets the Dataflow Job with the given jobId.
- getJobCheckIntervalInSecs() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getJobEndpoint() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getJobFileZip() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getJobId() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Get the id of this job.
- getJobId() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
-
The identity of the Dataflow job.
- getJobId() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
- getJobInfo() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
- getJobInfo() - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator.TranslationContext
- getJobInfo() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
- getJobLabelsMap() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getJobMessages(String, long) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
-
Return job messages sorted in ascending order by timestamp.
- getJobMetrics(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
Gets the JobMetrics with the given jobId.
- getJobMetrics(JobApi.GetJobMetricsRequest, StreamObserver<JobApi.GetJobMetricsResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- getJobMetrics(JobApi.GetJobMetricsRequest, StreamObserver<JobApi.GetJobMetricsResponse>) - Method in class org.apache.beam.runners.portability.testing.TestJobService
- getJobMonitoringPageURL(String, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
-
Deprecated.this method defaults the region to "us-central1". Prefer using the overload with an explicit regionId parameter.
- getJobMonitoringPageURL(String, String, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
- getJobName() - Method in interface org.apache.beam.sdk.options.PipelineOptions
- getJobs(JobApi.GetJobsRequest, StreamObserver<JobApi.GetJobsResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- getJobServerConfig() - Method in interface org.apache.beam.runners.portability.testing.TestPortablePipelineOptions
- getJobServerDriver() - Method in interface org.apache.beam.runners.portability.testing.TestPortablePipelineOptions
- getJobServerTimeout() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getJobServerUrl() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
- getJobService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
-
Returns a real, mock, or fake BigQueryServices.JobService.
- getJobService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
- getJobService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- getJobType() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getJoinColumns(boolean, List<Pair<RexNode, RexNode>>, int, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms
- getJsonBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
-
Returns a SimpleFunction mapping JSON byte[] arrays to Beam Rows.
- getJsonClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getJsonFactory() - Static method in class org.apache.beam.sdk.extensions.gcp.util.Transport
- getJsonFactory() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- getJsonStringToRowFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
- getJsonTimePartitioning() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getJsonToRowWithErrFn() - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError
- getKeep() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getKeep() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getKeep() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
- getKeepFields() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getKey() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkBroadcastStateInternals
- getKey() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals
- getKey() - Method in interface org.apache.beam.runners.local.Bundle
-
Returns the key that was output in the most recent GroupByKey in the execution of this bundle.
- getKey() - Method in class org.apache.beam.runners.local.StructuralKey
-
Returns the key that this StructuralKey was created from.
- getKey() - Method in class org.apache.beam.runners.spark.stateful.SparkStateInternals
- getKey() - Method in class org.apache.beam.runners.spark.translation.SparkProcessContext
- getKey() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
- getKey() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- getKey() - Method in class org.apache.beam.sdk.metrics.MetricResult
- getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
- getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
The key for the display item.
- getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The key for the display item.
- getKey() - Method in class org.apache.beam.sdk.values.KV
-
Returns the key of this KV.
- getKey() - Method in class org.apache.beam.sdk.values.ShardedKey
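KV.getKey, listed above, pairs with KV.getValue as the basic key-value accessor used throughout Beam. A minimal sketch, assuming the Beam Java SDK (sdks/java/core) is on the classpath; the class name KvDemo is illustrative:

```java
import org.apache.beam.sdk.values.KV;

public class KvDemo {
  public static void main(String[] args) {
    // KV is an immutable key-value pair.
    KV<String, Long> kv = KV.of("clicks", 7L);
    System.out.println(kv.getKey());   // clicks
    System.out.println(kv.getValue()); // 7
  }
}
```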
-
Deprecated.
- getKey(Coder<K>) - Method in class org.apache.beam.runners.flink.adapter.FlinkKey
- getKey(WindowedValue<KeyedWorkItem<K, V>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.WorkItemKeySelector
- getKey(WindowedValue<KV<K, InputT>>) - Method in class org.apache.beam.runners.flink.translation.types.KvKeySelector
- getKey(WindowedValue<KV<K, InputT>>) - Method in class org.apache.beam.runners.flink.translation.types.WindowedKvKeySelector
- getKey(WindowedValue<KV<K, V>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.KvToFlinkKeyKeySelector
- getKey(WindowedValue<KV<KV<K, V>, Double>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SdfFlinkKeyKeySelector
- getKeyClass() - Method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
- getKeyCoder() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.MetadataKeyCoder
- getKeyCoder() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- getKeyCoder() - Method in class org.apache.beam.sdk.coders.KvCoder
- getKeyCoder() - Method in class org.apache.beam.sdk.coders.MapCoder
- getKeyCoder() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- getKeyCoder() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getKeyCoder() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getKeyCoder() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
- getKeyCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns the Coder of the keys of the input to this transform, which is also used as the Coder of the keys of the output of this transform.
- getKeyCoder(Pipeline, Coder<KV<KeyT, KV<Long, EventT>>>) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
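The getKeyCoder accessors above expose the key half of a KV coder, which GroupByKey reuses for the keys of its output. A minimal sketch using KvCoder, assuming the Beam Java SDK (sdks/java/core) is on the classpath; the class name KeyCoderDemo is illustrative:

```java
import org.apache.beam.sdk.coders.Coder;
import org.apache.beam.sdk.coders.KvCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.coders.VarLongCoder;

public class KeyCoderDemo {
  public static void main(String[] args) {
    // A KvCoder is composed from a key coder and a value coder.
    KvCoder<String, Long> kvCoder =
        KvCoder.of(StringUtf8Coder.of(), VarLongCoder.of());

    // The component coders can be recovered from the composite.
    Coder<String> keyCoder = kvCoder.getKeyCoder();
    Coder<Long> valueCoder = kvCoder.getValueCoder();
    System.out.println(keyCoder + " / " + valueCoder);
  }
}
```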
-
Provide the key coder.
- getKeyComponent(int) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecord
-
Returns the key component at the specified index.
- getKeyComponentCoder(int) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
-
Returns the key coder at the specified index.
- getKeyComponentCoders() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
-
Returns the list of key component coders.
- getKeyComponents() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecord
-
Returns the list of key components.
- getKeyDeserializerProvider() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getKeyedCollections() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Returns a list of TaggedKeyedPCollections for the PCollections contained in this KeyedPCollectionTuple.
- getKeyedResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
-
Gets resources with input SearchParameter key.
- getKeyParts(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
- getKeyRange() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
-
Returns the range of keys that will be read from the table.
- getKeys() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- getKeySerializer() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getKeySet() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getKeysJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
-
The primary keys of this specific modification.
- getKeystorePassword() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getKeystorePath() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getKeyTranslationFunction() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getKeyTypeDescriptor() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getKind() - Method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
- getKind() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- getKind() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
-
Return the display name for this factory.
- getKind() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
- getKind() - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
- getKindString() - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
- getKindString() - Method in class org.apache.beam.sdk.io.Read.Bounded
- getKindString() - Method in class org.apache.beam.sdk.io.Read.Unbounded
- getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
- getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
- getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
- getKindString() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
- getKindString() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
- getKindString() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Returns the name to use by default for this PTransform (not including the names of any enclosing PTransforms).
- getKindString() - Method in class org.apache.beam.sdk.transforms.Tee
- getKindString() - Method in class org.apache.beam.sdk.transforms.windowing.Window
- getKindString() - Method in class org.apache.beam.sdk.values.PValueBase
-
Returns a String capturing the kind of this PValueBase.
- getKinesisIOConsumerArns() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions
-
Used to enable / disable EFO.
- getKmsKey() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- getKmsKey() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getKV() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getLabel() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
Retrieve the optional label for an item.
- getLabel() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The optional label for an item.
- getLabels() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Labels that will be applied to the billing records for this job.
- getLabels() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets labels.
- getLabels() - Method in class org.apache.beam.sdk.metrics.MetricName
-
Associated labels for the metric.
- getLanguage() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
- getLanguage() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
- getLastContiguousSequenceRange() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getLastEmitted() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
-
Returns the last value emitted by the reader.
- getLastFieldId() - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- getLastProcessedSequence() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getLastRunTimestamp() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- getLastUpdated() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- getLastWatermarkedBatchTime() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
- getLatencyNanos() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
-
The publishing latency in nanoseconds.
- getLatencyTrackingInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getLatestBufferedSequence() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getLatestHL7v2SendTime(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Gets the latest HL7v2 send time.
- getLatestHL7v2SendTime(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- getLateTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
- getLeaves() - Method in class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
- getLeaves() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getLegacyDevContainerVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the version/tag for legacy SDK FnAPI container image.
- getLegacyEnvironmentMajorVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the legacy environment's major version number.
- getLength() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- getLength() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- getLength() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- getLength() - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
- getLength() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
- getLength() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
- getLevel() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
- getLimitCountOfSortRel() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
- getLinkUrl() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
Retrieve the optional link URL for an item.
- getLinkUrl() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The optional link URL for an item.
- getList() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
- getListeners() - Method in interface org.apache.beam.runners.spark.SparkContextOptions
- getListOfMaps(Map<String, Object>, String, List<Map<String, Object>>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getLiteralGqlQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getLoadBalanceBundles() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getLocalhost() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getLocalJobServicePortFile() - Method in interface org.apache.beam.runners.portability.testing.TestUniversalRunner.Options
-
A file containing the job service port, since Gradle needs to know this filename statically to provide it in Beam testing options.
- getLocalValue() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
- getLocalWindmillHostport() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getLocation() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
- getLocation() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getLocation() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getLocation() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getLocation() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getLockToAcquireForStateAccessDuringBundles() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
-
Subclasses may provide a lock to ensure that the state backend is not accessed concurrently during bundle execution.
- getLockToAcquireForStateAccessDuringBundles() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- getLogicalStartTime() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getLogicalType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- getLogicalType(Class<LogicalTypeT>) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Helper function for retrieving the concrete logical type subclass.
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.DateConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeMicrosConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampMicrosConversion
- getLogicalTypeValue(int, Class<T>) - Method in class org.apache.beam.sdk.values.Row
-
Returns the Logical Type input type for this field.
- getLogicalTypeValue(String, Class<T>) - Method in class org.apache.beam.sdk.values.Row
-
Returns the Logical Type input type for this field.
- getLoginTimeout() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getLoginTimeout() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getLogLevel() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
- getLogMdc() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Whether to include SLF4J MDC in log entries.
- getLogTopicVerification() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getLong(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getLong(Map<String, Object>, String, Long) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getLowWatermark() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
- getLowWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- getMainOutputTag() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
- getMainOutputTag() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- getMainOutputTag() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
- getMainTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
-
The main trigger, which will continue firing until the "until" trigger fires.
- getManifestListLocation() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getMap() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
- getMap(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a MAP value by field index; an IllegalStateException is thrown if the schema doesn't match.
- getMap(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a MAP value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getMapKeyType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
If the field is a map type, returns the key type.
- getMapKeyType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- getMapping() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
- getMapping() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
- getMapType(TypeDescriptor, int) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- getMapValueType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
If the field is a map type, returns the value type.
- getMapValueType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- getMatcher() - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.MatcherAndError
- getMatchUpdatedFiles() - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
- getMaterialization() - Method in class org.apache.beam.sdk.transforms.ViewFn
-
Gets the materialization of this
ViewFn. - getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapFromVoidKeyViewFn
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapViewFn
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapFromVoidKeyViewFn
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapViewFn
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn2
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn2
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn2
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
- getMax() - Method in class org.apache.beam.sdk.metrics.DistributionResult
- getMaxAttempts() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The maximum number of times a request will be attempted for a complete successful result.
- getMaxBufferingDuration() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- getMaxBufferingDurationMilliSec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getMaxBundlesFromWindmillOutstanding() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
-
Maximum number of bundles outstanding from windmill before the worker stops requesting.
- getMaxBundleSize() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getMaxBundleTimeMills() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getMaxBytesFromWindmillOutstanding() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
-
Maximum number of bytes outstanding from windmill before the worker stops requesting.
- getMaxCacheMemoryUsage(PipelineOptions) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMb
- getMaxCacheMemoryUsage(PipelineOptions) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions.MaxCacheMemoryUsageMb
- getMaxCacheMemoryUsageMb() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Size (in MB) for the process-wide cache within the SDK harness.
- getMaxCacheMemoryUsageMbClass() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
An instance of this class will be used to specify the maximum amount of memory to allocate to a cache within an SDK harness instance.
- getMaxCacheMemoryUsagePercent() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Size (in % [0 - 100]) for the process-wide cache within the SDK harness.
- getMaxCommitDelay() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getMaxConnectionPoolConnections() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getMaxCumulativeBackoff() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getMaxElementCountToTriggerContinuousSequenceRangeReevaluation() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler.OrderedProcessingGlobalSequenceHandler
-
Number of new elements to trigger the re-evaluation.
- getMaxEndOffset(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
- getMaxEndOffset(PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns the actual ending offset of the current source.
- getMaxInvocationHistory() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- getMaxLength() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
- getMaxLength() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
- getMaxNumericPrecision() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
- getMaxNumericScale() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
- getMaxNumRecords() - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration
- getMaxNumRecords() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getMaxNumWorkers() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
The maximum number of workers to use for the worker pool.
- getMaxNumWritersPerBundle() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getMaxOutputElementsPerBundle() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Returns the maximum number of elements that will be output per bundle.
- getMaxParallelism() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getMaxPrecision(SqlTypeName) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
- getMaxPreviewRecords() - Method in class org.apache.beam.sdk.io.cdap.context.BatchSourceContextImpl
- getMaxReadTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getMaxReadTimeSeconds() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getMaxReadTimeSecs() - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration
- getMaxRecordsPerBatch() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- getMaxStackTraceDepthToReport() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getMaxStreamingBatchSize() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getMaxStreamingRowsToBatch() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getMD5() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
- getMean() - Method in class org.apache.beam.sdk.metrics.DistributionResult
- getMean() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
- getMemoryMB() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
-
Returns the configured size of the memory buffer.
- getMemoryMB() - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
-
Returns the configured size of the memory buffer.
- getMessage() - Method in class org.apache.beam.io.requestresponse.ApiIOError
-
The
Exception message. - getMessage() - Method in exception class org.apache.beam.sdk.coders.Coder.NonDeterministicException
- getMessage() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
-
The caught
Throwable.getMessage(). - getMessage() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
-
Underlying Message.
- getMessage() - Method in exception class org.apache.beam.sdk.transforms.windowing.IncompatibleWindowException
- getMessageBacklog() - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
-
Current backlog in messages (latest offset of the partition - last processed record offset).
- getMessageConverter(DestinationT, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- getMessageId() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
SQS message id.
- getMessageId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the messageId of the message populated by Cloud Pub/Sub.
- getMessageId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getMessageId() - Method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey
- getMessageId() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
-
The message id of the message that was published.
- getMessageId() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the unique identifier of the message, an application-specific message identifier string.
- getMessageName() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getMessageName() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getMessageName() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getMessageName() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getMessages() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.Result
- getMessages() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- getMessageStream(JobApi.JobMessagesRequest, StreamObserver<JobApi.JobMessagesResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- getMessageType() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
Returns the Protocol Buffers
Message type this ProtoCoder supports. - getMessageType() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets message type.
- getMetadata() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecord
-
Returns the metadata.
- getMetadata() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
-
Returns the
MatchResult.Metadata of the file. - getMetadata() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
-
String representing the metadata of the Bundle to be written.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
-
String representing the metadata of the messageId to be read.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
-
Gets metadata.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Returns the gathered metadata for the change stream query so far.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The connector execution metadata for this record.
- getMetadata() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Returns the extracted metadata.
- getMetadata() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
- getMetadata(DestinationT) - Method in class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
-
Return AVRO file metadata for a given destination.
- getMetadata(MetadataEntity) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getMetadata(MetadataScope, MetadataEntity) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getMetadata(String) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Deprecated. Use schema options instead.
- getMetaData() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getMetadataCoder() - Method in class org.apache.beam.sdk.io.ReadableFileCoder
- getMetadataKey() - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat
-
An object representing a wildcard for a key component.
- getMetadataQuery() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- getMetadataString(String) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Deprecated. Use schema options instead.
- getMetadataTable() - Method in interface org.apache.beam.sdk.io.gcp.spanner.SpannerIO.SpannerChangeStreamOptions
-
Returns the name of the metadata table.
- getMetadataTableAdminDao() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- getMetadataTableDao() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- getMetadataTableDebugString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- getMetaStore() - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
- getMeters(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
- getMeters(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
- getMethod() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- getMethod() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- getMethod() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- getMethods(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
-
Returns the list of non-private, non-protected, non-static methods in the class, caching the results.
- getMethodsMap(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- getMetricGaugeName(String, int) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
-
Creates a MetricName based on the topic name and partition id.
- getMetricGroup() - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
- getMetricGroup() - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainerWithoutAccumulator
- getMetricLabels() - Method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.ParsedMetricName
- getMetrics() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
- getMetrics() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getMetricsContainer(String) - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
- getMetricsContainer(String) - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainerWithoutAccumulator
- getMetricsEnvironmentStateForCurrentThread() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
-
Returns the container holder for the current thread.
- getMetricsGraphiteHost() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- getMetricsGraphitePort() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- getMetricsHttpSinkUrl() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- getMetricsMapName(long) - Static method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- getMetricsPushPeriod() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- getMetricsSink() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- getMimeType() - Method in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- getMimeType() - Method in interface org.apache.beam.sdk.io.FileBasedSink.OutputFileHints
-
Returns the MIME type that should be used for the files that will hold the output data.
- getMin() - Method in class org.apache.beam.sdk.metrics.DistributionResult
- getMinBundleSize() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns the minimum bundle size that should be used when splitting the source into sub-sources.
- getMinConnectionPoolConnections() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getMinCpuPlatform() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Specifies a minimum CPU platform for VM instances.
- getMinimumTimestamp() - Method in interface org.apache.beam.runners.local.Bundle
-
Return the minimum timestamp among elements in this bundle.
- getMinPauseBetweenCheckpoints() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getMinReadTimeMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- getMissingPartitionsFrom(List<Range.ByteStringRange>, ByteString, ByteString) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Returns the partitions missing from the given list of partitions, within the range from start to end.
- getMissingPartitionsFromEntireKeySpace(List<Range.ByteStringRange>) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Returns the partitions missing from the entire keyspace.
- getMode() - Method in class org.apache.beam.sdk.io.FileBasedSource
- getMode() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getModeNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- getModifiableCollection() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
- getMods() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The modifications within this record.
- getModType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The type of operation that caused the modifications within this record.
- getMonitoringInfos() - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the cumulative values for any metrics in this container as MonitoringInfos.
- getMonthOfYear() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- getMutableOutput(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- getMutationInformation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation
- GetMutationsFromBeamRow() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.GetMutationsFromBeamRow
- getMutationType() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
- getNaiveObjectSerializer() - Static method in class org.apache.beam.runners.flink.translation.utils.SerdeUtils
- getName() - Method in enum class org.apache.beam.io.debezium.Connectors
-
The name of this connector class.
- getName() - Method in class org.apache.beam.runners.jet.metrics.BoundedTrieImpl
- getName() - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
- getName() - Method in class org.apache.beam.runners.jet.metrics.DistributionImpl
- getName() - Method in class org.apache.beam.runners.jet.metrics.GaugeImpl
- getName() - Method in class org.apache.beam.runners.jet.metrics.StringSetImpl
- getName() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
- getName() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
- getName() - Method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
- getName() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
- getName() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets name.
- getName() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- getName() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- getName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
The name of the column.
- getName() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.Column
- getName() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
- getName() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Destination
-
Gets the name of the destination.
- getName() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Queue
- getName() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Topic
- getName() - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
- getName() - Method in class org.apache.beam.sdk.metrics.DelegatingDistribution
- getName() - Method in class org.apache.beam.sdk.metrics.DelegatingGauge
- getName() - Method in class org.apache.beam.sdk.metrics.DelegatingHistogram
- getName() - Method in interface org.apache.beam.sdk.metrics.Metric
-
The
MetricName given to this metric. - getName() - Method in class org.apache.beam.sdk.metrics.MetricName
-
The name of this metric.
- getName() - Method in class org.apache.beam.sdk.metrics.MetricNameFilter
-
If set, the metric must have this name to match this
MetricNameFilter. - getName() - Method in class org.apache.beam.sdk.metrics.MetricResult
-
Return the name of the metric.
- getName() - Method in class org.apache.beam.sdk.metrics.NoOpCounter
- getName() - Method in class org.apache.beam.sdk.metrics.NoOpHistogram
- getName() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
Returns the field name.
- getName() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
- getName() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
- getName() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
- getName() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
- getName() - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns the field name.
- getName() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
- getName() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Returns the transform name.
- getName() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the name of this
PCollection. - getName() - Method in interface org.apache.beam.sdk.values.PValue
-
Returns the name of this
PValue. - getName() - Method in class org.apache.beam.sdk.values.PValueBase
-
Returns the name of this
PValueBase. - getName(int) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- getNameCount() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- getNameOverride() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- getNameOverride() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
- getNameOverride(String, T) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- getNamespace() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getNamespace() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getNamespace() - Method in class org.apache.beam.sdk.metrics.MetricName
-
The namespace associated with this metric.
- getNamespace() - Method in class org.apache.beam.sdk.metrics.MetricNameFilter
-
The inNamespace that a metric must be in to match this
MetricNameFilter. - getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
- getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
The namespace for the display item.
- getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The namespace for the display item.
- getNeedsAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
- getNeedsMessageId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
- getNeedsOrderingKey() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
- getNestedFieldsAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
- getNetwork() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
GCE network for launching workers.
- getNetworkTimeout() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getNewBigqueryClient(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- getNewValuesJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
-
The new column values after the modification was applied.
- getNextId() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.UniqueIdGenerator
-
Returns a random base64-encoded 8-byte string.
- getNextOffset() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- getNextProcessingTimer() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Finds the latest timer in
TimeDomain.PROCESSING_TIMEdomain that has expired based on the current processing time. - getNextWindow() - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
- getNodeStats() - Method in interface org.apache.beam.sdk.extensions.sql.impl.planner.NodeStatsMetadata
- getNodeStats(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelMetadataQuery
- getNodeStats(RelNode, BeamRelMetadataQuery) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- getNodeStats(RelNode, RelMetadataQuery) - Method in interface org.apache.beam.sdk.extensions.sql.impl.planner.NodeStatsMetadata.Handler
- getNodeStats(RelNode, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats
- getNonCumulativeCost(RelNode, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl
- getNonNullPrefix() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
- getNonSpeculativeIndex() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
The zero-based index of this trigger firing among non-speculative panes.
- getNonWildcardPrefix(String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns the prefix portion of the glob that doesn't contain wildcards.
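getNonWildcardPrefix(String) keeps everything up to the first glob wildcard. A minimal re-implementation of that idea (the helper name and the exact wildcard set are assumptions for illustration, not Beam's code):

```java
final class GlobPrefix {
    // Returns the longest prefix of the glob containing no wildcard
    // characters; the wildcard set {*, ?, [, ]} is an assumption here.
    static String nonWildcardPrefix(String glob) {
        for (int i = 0; i < glob.length(); i++) {
            char c = glob.charAt(i);
            if (c == '*' || c == '?' || c == '[' || c == ']') {
                return glob.substring(0, i);
            }
        }
        return glob; // no wildcards at all
    }
}
```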
- getNormalizeKeyLen() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- getNoSpilling() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getNotSupported() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTableFilter
-
Identifies the parts of a predicate that the IO's push-down capabilities do not support, so they can be preserved in a
CalcfollowingBeamIOSourceRel. - getNotSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.DefaultTableFilter
-
Since predicate push-down is not supported by default, returns the list of filters unchanged so they are preserved.
- getNotSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
- getNotSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergFilter
- getNotSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableFilter
- getNullable() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- getNullableValue() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Workaround for autovalue code generation, which does not allow type variables to be instantiated with nullable actual parameters.
- getNullFirst() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
- getNullParams() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamSqlUnparseContext
- getNum() - Method in class org.apache.beam.runners.spark.io.ConsoleIO.Write.Unbound
- getNumber() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
Optionally returns the field index.
- getNumber() - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
- getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
- getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- getNumberOfBufferedEvents() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getNumberOfExecutionRetries() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getNumberOfKeys() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- getNumberOfPartitionsInTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The total number of partitions for the given transaction.
- getNumberOfReceivedEvents() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getNumberOfRecordsInTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The total number of data change records for the given transaction.
- getNumberOfRecordsRead() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the total number of records read from the change stream so far.
- getNumberOfRecordsRead() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The number of records read in the partition change stream query before reading this record.
- getNumberOfShardKeyCoders(List<?>) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- getNumberOfWorkerHarnessThreads() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Number of threads to use on the Dataflow worker harness.
- getNumberOverride(int, T) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- getNumBytes() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteSuccessSummary
- getNumBytes() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- getNumConcurrentCheckpoints() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getNumConsumers(PValue) - Method in class org.apache.beam.runners.flink.translation.utils.CountingPipelineVisitor
-
Calculates the number of consumers of a given
PValue. - getNumEntities(PipelineOptions, String, String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns the number of entities available for reading.
- getNumExtractJobCalls() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- getNumPartitions() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getNumQuerySplits() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getNumRows(BigQueryOptions, TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
-
Returns the number of rows for a given table.
- getNumSampledBytesPerFile() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
- getNumShards() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- getNumShards() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getNumShards() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getNumShardsProvider() - Method in class org.apache.beam.sdk.io.WriteFiles
- getNumSplits() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- getNumStorageWriteApiStreamAppendClients() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getNumStorageWriteApiStreams() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getNumStreamingKeys() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getNumStreams() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getNumWorkers() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Number of workers to use when executing the Dataflow job.
- getNumWrites() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteSuccessSummary
- getNumWrites() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- getOauthToken() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getOauthToken() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getOAuthToken() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getObject() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Returns the object name associated with this GCS path, or an empty string if no object is specified.
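GcsPath splits a gs:// URI into a bucket and an object name, with getObject() returning an empty string when no object is present. A rough plain-Java sketch of that split (not Beam's implementation):

```java
final class GcsPaths {
    // Given "gs://bucket/some/object", returns the object part
    // ("some/object"), or "" if no object is specified.
    static String gcsObject(String uri) {
        String rest = uri.substring("gs://".length());
        int slash = rest.indexOf('/');
        return slash < 0 ? "" : rest.substring(slash + 1);
    }
}
```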
- getObject(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getObject(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getObject(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns the
StorageObjectfor the givenGcsPath. - getObjectMapper() - Static method in class org.apache.beam.sdk.extensions.sql.TableUtils
- getObjectReuse() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getObjects(List<GcsPath>) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns
StorageObjectOrIOExceptionsfor the givenGcsPaths. - getObservedTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getObservedTimestamp() - Method in class org.apache.beam.io.requestresponse.ApiIOError
-
The observed timestamp of the error.
- getObservedTimestamp() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
-
The date and time when the
Exceptionoccurred. - getOffset() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
- getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
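For FixedWindows, the offset returned by getOffset() shifts window boundaries away from the epoch. The assignment arithmetic can be sketched as follows (the helper is illustrative, not Beam's code):

```java
final class WindowMath {
    // Start of the fixed window containing tsMillis, for windows of
    // sizeMillis shifted by offsetMillis from the epoch.
    static long windowStart(long tsMillis, long sizeMillis, long offsetMillis) {
        return tsMillis - Math.floorMod(tsMillis - offsetMillis, sizeMillis);
    }
}
```

With no offset, a timestamp of 65ms in 60ms windows falls in the window starting at 60ms; with a 30ms offset, the same timestamp falls in the window starting at 30ms.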
- getOffsetConsumerConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getOffsetDeduplication() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getOffsetDeduplication() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getOffsetLimit() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
- getOffsetLimit() - Method in interface org.apache.beam.sdk.io.UnboundedSource.CheckpointMark
- getOldValuesJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
-
The old column values before the modification was applied.
- getOnCreateMatcher() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
- getOneOfSchema() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Returns the schema of the underlying
Rowthat is used to represent the union. - getOneOfTypes() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- getOneRecord(Map<String, String>) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- getOnly() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getOnly() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getOnly(String) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Like
CoGbkResult.getOnly(TupleTag)but using a String instead of a TupleTag. - getOnly(String, V) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Like
CoGbkResult.getOnly(TupleTag, Object)but using a String instead of a TupleTag. - getOnly(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
If there is a singleton value for the given tag, returns it.
- getOnly(TupleTag<V>, V) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
If there is a singleton value for the given tag, returns it.
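The getOnly overloads above return the singleton value for a tag, fall back to a supplied default when the tag has no values, and fail when there are multiple values. That contract, as a plain-Java sketch of the default-taking variant (not Beam's implementation):

```java
import java.util.List;

final class CoGbkOnly {
    // Exactly one value -> return it; no values -> return the default;
    // multiple values -> error, since a singleton was expected.
    static <V> V getOnly(List<V> values, V defaultValue) {
        if (values.isEmpty()) {
            return defaultValue;
        }
        if (values.size() == 1) {
            return values.get(0);
        }
        throw new IllegalArgumentException(
            "expected at most one value, got " + values.size());
    }
}
```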
- getOnSuccessMatcher() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
- getOnTimeBehavior() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getOperand0() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCheckConstraint
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlColumnDeclaration
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateCatalog
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateDatabase
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateExternalTable
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropCatalog
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropDatabase
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropTable
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlUseCatalog
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlUseDatabase
- getOperands() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPCall
- getOperation() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
- getOperation() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getOperationMode() - Method in class org.apache.beam.runners.twister2.BeamBatchTSetEnvironment
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPCall
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCheckConstraint
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlColumnDeclaration
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlUseCatalog
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlUseDatabase
- getOperatorChaining() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getOptionNames() - Method in class org.apache.beam.sdk.schemas.Schema.Options
- getOptions() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
- getOptions() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getOptions() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getOptions() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getOptions() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getOptions() - Method in class org.apache.beam.sdk.Pipeline
- getOptions() - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns the field's
Schema.Options. - getOptions() - Method in class org.apache.beam.sdk.schemas.Schema
- getOptions() - Method in class org.apache.beam.sdk.testing.TestPipeline
- getOptionsId() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
Provides a process wide unique ID for this
PipelineOptionsobject, assigned at graph construction time. - getOptionsSupplier() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
- getOptionsSupplier() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getOrCreate(BigtableConfig) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
-
Creates a BigtableChangeStreamAccessor if one doesn't exist and stores it in the cache for faster access.
- getOrCreate(SpannerConfig) - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- getOrCreateReader(PipelineOptions, CheckpointMarkT) - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
- getOrCreateSession(SparkStructuredStreamingPipelineOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.SparkSessionFactory
-
Gets active
SparkSessionor creates one usingSparkStructuredStreamingPipelineOptions. - getOrDecode(Coder<T>) - Method in class org.apache.beam.runners.spark.translation.ValueAndCoderLazySerializable
- getOrDefault(K, V) - Method in interface org.apache.beam.sdk.state.MapState
-
A deferred lookup.
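MapState.getOrDefault(K, V) is documented as a deferred lookup: the read is not resolved until the result is actually consumed. The shape of that API, sketched with a Supplier over a plain map (names and types are assumptions, not Beam's MapState):

```java
import java.util.Map;
import java.util.function.Supplier;

final class DeferredMap {
    // Deferred lookup: nothing is read from the backing map until
    // get() is invoked on the returned Supplier.
    static <K, V> Supplier<V> getOrDefaultDeferred(Map<K, V> backing, K key, V dflt) {
        return () -> backing.getOrDefault(key, dflt);
    }
}
```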
- getOrderingKey() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the ordering key of the message.
- getOrdinalPosition() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
The position of the column in the table.
- getOrphanedNewPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
-
Returns a list of NewPartition that have persisted long enough to be considered orphaned and do not overlap with any missing partition.
- getOrThrowException() - Method in class org.apache.beam.sdk.io.cdap.context.FailureCollectorWrapper
- getOutboundObserver() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
- getOutName(int) - Method in class org.apache.beam.sdk.values.TupleTag
-
If this
TupleTagis tagging outputoutputIndexof aPTransform, returns the name that should be used by default for the output. - getOutput() - Method in interface org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.WordCountOptions
- getOutput() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getOutput() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
-
The
CsvIOParseResultPCollectionas a result of successfully parsing CSV records. - getOutput() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling
- getOutput() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling
- getOutput() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling
- getOutput() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- getOutput() - Method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- getOutput() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.DefaultErrorHandler
- getOutput() - Method in interface org.apache.beam.sdk.transforms.errorhandling.ErrorHandler
- getOutput() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler
- getOutput(PTransform<?, OutputT>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
- getOutput(PTransform<?, T>) - Method in class org.apache.beam.runners.flink.translation.utils.LookupPipelineVisitor
- getOutput(PTransform<?, T>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getOutput(PTransform<?, T>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getOutput(TupleTag<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getOutputCoder() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.impulse.BeamImpulseSource
- getOutputCoder() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
- getOutputCoder() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.BoundedCosmosBDSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.CompressedSource
-
Returns the delegate source's output coder.
- getOutputCoder() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
- getOutputCoder() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.Source
-
Returns the
Coderto use for the data read from this source. - getOutputCoder() - Method in class org.apache.beam.sdk.io.TextSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.xml.XmlSource
- getOutputCoder(SerializableFunction<InputT, OutputT>, Coder<InputT>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
Deprecated.This method is to change in an unknown backwards incompatible way once support for this functionality is refined.
- getOutputCoders() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getOutputCoders() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getOutputCoders(PTransform<?, ?>) - Method in class org.apache.beam.runners.flink.translation.utils.LookupPipelineVisitor
- getOutputExecutablePath() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getOutputFile() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
- getOutputFilePrefix() - Method in interface org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample.DCExamplePipelineOptions
-
Output file prefix.
- getOutputFormatProvider() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getOutputId(PipelineNode.PTransformNode) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
- getOutputKvCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
-
Returns the
Coderof the output of this transform. - getOutputKvCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns the
Coderof the output of this transform. - getOutputManager() - Method in interface org.apache.beam.runners.spark.translation.SparkInputDataProcessor
- getOutputOrNull(ErrorHandling) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- getOutputParallelization() - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
-
Returns whether to avoid the reshuffle operation.
- getOutputParallelization() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getOutputParallelization() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
- getOutputPortSchemas() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getOutputPrefix() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getOutputs() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getOutputs() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getOutputs() - Method in class org.apache.beam.sdk.testing.TestOutputReceiver
- getOutputs(PTransform<?, ?>) - Method in class org.apache.beam.runners.flink.translation.utils.LookupPipelineVisitor
- getOutputs(PTransform<?, ?>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getOutputs(PTransform<?, OutputT>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Returns the output of the transform currently being translated.
- getOutputSchema() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getOutputSchema(Schema, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
-
Get the output schema resulting from selecting the given
FieldAccessDescriptorfrom the given schema. - getOutputStrategyInternal(WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Get the output strategy of this
Window PTransform. - getOutputStream() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
-
Returns the
TypeVariableofOutputT. - getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
-
Returns the
TypeVariableofOutputT. - getOutputType() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
- getOutputType() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getOutputType() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf
- getOutputType() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns a
TypeDescriptorcapturing what is known statically about the output type of thisCombineFninstance's most-derived class. - getOutputTypeDescriptor() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
- getOutputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFn
-
Returns a
TypeDescriptorcapturing what is known statically about the output type of thisDoFninstance's most-derived class. - getOutputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.InferableFunction
-
Returns a
TypeDescriptorcapturing what is known statically about the output type of thisInferableFunctioninstance's most-derived class. - getOutputWatermarkHold() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
- getOverlappingPartitions(List<Range.ByteStringRange>) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Return a list of overlapping partitions.
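The overlap semantics behind getOverlappingPartitions can be sketched with plain string keys standing in for ByteStringRange boundaries (an illustrative sketch under that assumption, not Beam's implementation): two half-open ranges overlap exactly when each starts before the other ends.

```java
import java.util.ArrayList;
import java.util.List;

public class OverlapSketch {
    // Simplified stand-in for a Bigtable key range: a half-open [start, end) range.
    record KeyRange(String start, String end) {}

    // Two half-open ranges overlap when each one starts before the other ends.
    static boolean overlaps(KeyRange a, KeyRange b) {
        return a.start().compareTo(b.end()) < 0 && b.start().compareTo(a.end()) < 0;
    }

    // Returns every pair of input partitions whose ranges overlap.
    static List<KeyRange[]> overlappingPartitions(List<KeyRange> partitions) {
        List<KeyRange[]> result = new ArrayList<>();
        for (int i = 0; i < partitions.size(); i++) {
            for (int j = i + 1; j < partitions.size(); j++) {
                if (overlaps(partitions.get(i), partitions.get(j))) {
                    result.add(new KeyRange[] {partitions.get(i), partitions.get(j)});
                }
            }
        }
        return result;
    }
}
```

Note that ranges sharing only a boundary, such as [a, b) and [b, c), do not overlap under this definition.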
- getOverloadRatio() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The target ratio between requests sent and successful requests.
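As a rough illustration of the overload ratio described above (a hypothetical helper, not part of RpcQosOptions): with counters for requests sent and requests successful, throttling kicks in once sent divided by successful exceeds the configured target.

```java
public class OverloadSketch {
    // Illustrative: compare requests sent against requests that succeeded and
    // throttle when the observed ratio exceeds the configured target ratio.
    static boolean shouldThrottle(long sent, long successful, double targetRatio) {
        if (successful == 0) {
            return sent > 0; // every request is failing: back off
        }
        return (double) sent / successful > targetRatio;
    }
}
```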
- getOverrideWindmillBinary() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
-
Custom windmill_main binary to use with the streaming runner.
- getPaneInfo() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
- getPaneInfo() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the pane of this FailsafeValueInSingleWindow in its window.
- getPaneInfo() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Returns the pane of this ValueInSingleWindow in its window.
- getPaneInfo() - Method in interface org.apache.beam.sdk.values.WindowedValue
-
The PaneInfo associated with this WindowedValue.
- getPaneInfo() - Method in class org.apache.beam.sdk.values.WindowedValues.Builder
- getPaneInfo() - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- getParallelism() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getParallelism() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getParameters() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
- getParameters() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase
-
Returns the parameters of this function.
- getParamWindowedValueCoder(Coder<T>) - Static method in class org.apache.beam.sdk.values.WindowedValues
-
Returns the ParamWindowedValueCoder from the given value Coder.
- getParent() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Returns the parent path, or null if this path does not have a parent.
- getParentId() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getParentLowWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- getParentPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- getParentPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- getParents() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- getParentTokens() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
The unique partition identifiers of the parent partitions where this child partition originated from.
- getParentTokens() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The unique partition identifiers of the parent partitions where this child partition originated from.
- getParquetConfiguration() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- getParseFn() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getParser() - Method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
-
Get the memoized Parser, possibly initializing it lazily.
- getParser() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
Get the memoized Parser, possibly initializing it lazily.
- getPartition() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- getPartition() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- getPartition() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
- getPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- getPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getPartition(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Fetches the partition metadata row data for the given partition token.
- getPartition(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Fetches the partition metadata row data for the given partition token.
- getPartitionColumn() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getPartitionCreatedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which this partition was first detected and created in the metadata table.
- getPartitionEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The end time for the partition change stream query, which produced this record.
- getPartitionFields() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
- getPartitionFields() - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
- getPartitionFields() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getPartitionKey() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getPartitionKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner.ExplicitPartitioner
- getPartitionKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
-
Determines which shard in the stream the record is assigned to.
- getPartitionMetadataAdminDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Creates and returns a singleton DAO instance for admin operations over the partition metadata table.
- getPartitionMetadataDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Creates and returns a singleton DAO instance for accessing the partition metadata table.
- getPartitionQueryTimeout() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getPartitionReadTimeout() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getPartitionRunningAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the connector started processing this partition.
- getPartitions() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Bounded
- getPartitions() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
- getPartitions() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
- getPartitionScheduledAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which this partition was scheduled to be queried.
- getPartitionSpec() - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
-
Partition spec destination, in the event that it must be dynamically created.
- getPartitionStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The start time for the partition change stream query, which produced this record.
- getPartitionsToReconcile(Instant, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
-
For missing partitions, try to organize the mismatched parent tokens in a way to fill the missing partitions.
- getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The partition token that produced this change stream record.
- getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The unique identifier of the partition that generated this record.
- getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
Unique partition identifier, which can be used to perform a change stream query.
- getPartitionTokens() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionStartRecord
-
List of partitions yielded within this record.
- getPassword() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- getPassword() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getPassword() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getPassword() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getPassword() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getPassword() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getPassword() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getPath() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Table path up to the leaf table name.
- getPath() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SchemaPath
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- getPath() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration
- getPath() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
- getPath() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
- getPath() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
The path for the display item within a component hierarchy.
- getPathPrefix() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getPathValidator() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
The path validator instance that should be used to validate paths.
- getPathValidatorClass() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
The class of the validator that should be created and used to validate paths.
- getPatientCompartments() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
-
Gets the patient compartment responses for GetPatientEverything requests.
- getPatientEverything() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Get the patient compartment for a FHIR Patient using the GetPatientEverything/$everything API.
- getPatientEverything(String, Map<String, Object>, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Fhir get patient everything http body.
- getPatientEverything(String, Map<String, Object>, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- getPatternCondition() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
- getPatternVar() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
- getPayload() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- getPayload() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the main PubSub message.
- getPayload() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getPayload() - Method in class org.apache.beam.sdk.io.mqtt.MqttRecord
- getPayload() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the payload of the message as a byte array.
- getPayload() - Method in class org.apache.beam.sdk.schemas.io.Failure
-
Bytes containing the payload which has failed.
- getPayload() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UnknownLogicalType
- getPayload(AvroGenericCoder) - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderTranslator
- getPayload(WindowedValues.ParamWindowedValueCoder<?>) - Static method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
-
Returns the serialized payload that will be provided when deserializing this coder.
- getPCollection() - Method in interface org.apache.beam.runners.local.Bundle
-
Returns the PCollection that the elements of this bundle belong to.
- getPCollection() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
For internal use only.
- getPCollection() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
- getPCollectionConsumptionMap() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Get the map of PCollection to the number of PTransform consuming it.
- getPCollectionInputs() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
- getPCollectionInputs() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
- getPerDestinationOutputFilenames() - Method in class org.apache.beam.sdk.io.WriteFilesResult
-
Returns a PCollection of all output filenames generated by this WriteFiles organized by user destination type.
- getPerElementConsumers(CollectionT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
- getPerElementInputs(ExecutableT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
- getPeriod() - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
-
Amount of time between generated windows.
- getPeriod() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- getPeriod() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
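The period returned by getPeriod above is the spacing between the start timestamps of consecutive sliding windows. A simplified sketch of the window-assignment arithmetic (illustrative only; starts, sizes, and periods are plain longs here rather than Beam Instants and Durations):

```java
import java.util.ArrayList;
import java.util.List;

public class SlidingSketch {
    // Illustrative: start offsets of every sliding window of length `size` that
    // contains `timestamp`, with a new window starting every `period` units.
    static List<Long> windowStartsFor(long timestamp, long size, long period) {
        List<Long> starts = new ArrayList<>();
        // The latest window start at or before the timestamp.
        long lastStart = timestamp - Math.floorMod(timestamp, period);
        // Walk backwards one period at a time while the window still covers the timestamp.
        for (long start = lastStart; start > timestamp - size; start -= period) {
            starts.add(start);
        }
        return starts;
    }
}
```

With size 10 and period 5, each timestamp lands in two windows, which is the usual size-over-period count.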
- getPeriodicStatusPageOutputDirectory() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getPerWorkerCounter(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the Counter that should be used for implementing the given per-worker metricName in this container.
- getPerWorkerMetricsUpdateReportingPeriodMillis() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getPgJsonb(int) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Returns the record at the current pointer as JsonB.
- getPipeline() - Method in class org.apache.beam.io.requestresponse.Result
- getPipeline() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Retrieve the job's pipeline.
- getPipeline() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getPipeline() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
- getPipeline() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- getPipeline() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteResult
- getPipeline() - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
- getPipeline() - Method in class org.apache.beam.sdk.io.solace.write.SolaceOutput
- getPipeline() - Method in class org.apache.beam.sdk.io.WriteFilesResult
- getPipeline() - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
- getPipeline() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
- getPipeline() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
- getPipeline() - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
- getPipeline() - Method in class org.apache.beam.sdk.values.PBegin
- getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionList
- getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
- getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionTuple
- getPipeline() - Method in class org.apache.beam.sdk.values.PDone
- getPipeline() - Method in interface org.apache.beam.sdk.values.PInput
- getPipeline() - Method in interface org.apache.beam.sdk.values.POutput
- getPipeline() - Method in class org.apache.beam.sdk.values.PValueBase
- getPipeline(JobApi.GetJobPipelineRequest, StreamObserver<JobApi.GetJobPipelineResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- getPipelineFromClasspath(String) - Static method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
- getPipelineName() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getPipelineOptions() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Options
- getPipelineOptions() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Returns the configured pipeline options.
- getPipelineOptions() - Method in class org.apache.beam.runners.direct.DirectRegistrar.Options
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
- getPipelineOptions() - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator.TranslationContext
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkRunner
-
For testing.
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Options
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.TestFlinkRunner
- getPipelineOptions() - Method in class org.apache.beam.runners.jet.JetRunnerRegistrar.Options
- getPipelineOptions() - Method in class org.apache.beam.runners.portability.testing.TestPortablePipelineOptions.TestPortablePipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.runners.portability.testing.TestUniversalRunner.OptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.runners.prism.PrismRegistrar.Options
- getPipelineOptions() - Method in class org.apache.beam.runners.spark.SparkRunnerRegistrar.Options
- getPipelineOptions() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunnerRegistrar.Options
- getPipelineOptions() - Method in class org.apache.beam.runners.twister2.Twister2RunnerRegistrar.Options
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpPipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransformOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcConnection
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- getPipelineOptions() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
-
Perform a DFS (Depth-First-Search) to find the PipelineOptions config.
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions.KinesisIOOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.aws2.options.AwsPipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.azure.options.AzurePipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.options.DefaultPipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.options.ManualDockerEnvironmentOptions.Options
- getPipelineOptions() - Method in interface org.apache.beam.sdk.options.PipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.options.RemoteEnvironmentOptions.Options
- getPipelineOptions() - Method in interface org.apache.beam.sdk.state.StateContext
-
Returns the PipelineOptions specified with the PipelineRunner.
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.Context
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.StartBundleContext
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions.Options
- getPipelineOptionsFromClasspath(String) - Static method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
- getPipelinePolicy() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getPipelineProto() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Get the Runner API pipeline proto if available.
- getPipelineProto() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
- getPipelineRunners() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Runner
- getPipelineRunners() - Method in class org.apache.beam.runners.direct.DirectRegistrar.Runner
- getPipelineRunners() - Method in class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Runner
- getPipelineRunners() - Method in class org.apache.beam.runners.jet.JetRunnerRegistrar.Runner
- getPipelineRunners() - Method in class org.apache.beam.runners.portability.PortableRunnerRegistrar
- getPipelineRunners() - Method in class org.apache.beam.runners.portability.testing.TestUniversalRunner.RunnerRegistrar
- getPipelineRunners() - Method in class org.apache.beam.runners.prism.PrismRegistrar.Runner
- getPipelineRunners() - Method in class org.apache.beam.runners.spark.SparkRunnerRegistrar.Runner
- getPipelineRunners() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunnerRegistrar.Runner
- getPipelineRunners() - Method in class org.apache.beam.runners.twister2.Twister2RunnerRegistrar.Runner
- getPipelineUrl() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
The URL of the staged portable pipeline.
- getPlainText() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getPlanner() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- getPlannerName() - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
- getPluginClass() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets the main class of a plugin.
- getPluginConfig() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets a plugin config.
- getPluginProperties() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getPluginProperties(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getPluginType() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets a plugin type.
- getPollInterval() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getPollIntervalMillis() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
-
The time, in milliseconds, to wait before polling for new files.
- getPort() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- getPort() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- getPort() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServer
-
Get the port that this ExpansionServer is bound to.
- getPort() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
- getPort() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
- getPortNumber() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getPortNumber() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getPositionForFractionConsumed(double) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
Returns a position P such that the range [start, P) represents approximately the given fraction of the range [start, end).
- getPrecision() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
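The fraction-to-position mapping of OffsetRangeTracker.getPositionForFractionConsumed can be sketched as linear interpolation over the half-open offset range (an illustrative sketch; rounding up is an assumption made here, not a claim about Beam's exact rounding):

```java
public class FractionToPosition {
    // Illustrative: map a fraction in [0, 1] onto the half-open offset range
    // [start, end) so that [start, P) covers approximately that fraction.
    static long positionForFraction(long start, long end, double fraction) {
        // Round up so any nonzero fraction advances past at least one offset.
        return start + (long) Math.ceil(fraction * (end - start));
    }
}
```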
- getPrecision() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
- getPrecision() - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
- getPredefinedCsvFormat() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.CsvConfiguration
-
See CSVFormat.Predefined.values() for a list of allowed values.
- getPreferGroupByKeyToHandleHugeValues() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- getPrefix() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
First element in the path.
- getPrefix() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
- getPrefixedEndpoint(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getPreviousWindow() - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
- getPrimary() - Method in class org.apache.beam.sdk.transforms.splittabledofn.SplitResult
-
Returns the primary restriction.
- getPrimaryKey() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getPriority() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getPriority() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the priority level of the message (0-255, higher is more important).
- getPrismLocation() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- getPrismLogLevel() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- getPrismVersionOverride() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- getPrivateKey() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getPrivateKeyPassphrase() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getPrivateKeyPassphrase() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getPrivateKeyPassphrase() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getPrivateKeyPath() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getPrivateKeyPath() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getProcessBundleDescriptor() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
- getProcessBundleDescriptor() - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
- getProcessBundleDescriptor(String) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
- getProcessBundleDescriptor(BeamFnApi.GetProcessBundleDescriptorRequest, StreamObserver<BeamFnApi.ProcessBundleDescriptor>) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
- getProcessingTimeAdvance() - Method in class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
- getProcessor(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
-
Provides SdkHarnessClient.BundleProcessor that is capable of processing bundles not containing timers or state accesses such as: side inputs, user state, remote references.
- getProcessor(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>, StateDelegator) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
-
Provides SdkHarnessClient.BundleProcessor that is capable of processing bundles not containing timers.
- getProcessor(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>, StateDelegator, Map<String, Map<String, ProcessBundleDescriptors.TimerSpec>>) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
-
Provides SdkHarnessClient.BundleProcessor that is capable of processing bundles containing timers and state accesses such as: side inputs, user state, remote references.
- getProcessWideContainer() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
-
Return the MetricsContainer for the current process.
- getProduced(ExecutableT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
- getProducedType() - Method in class org.apache.beam.runners.flink.translation.types.KvKeySelector
- getProducedType() - Method in class org.apache.beam.runners.flink.translation.types.WindowedKvKeySelector
- getProducedType() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.KvToFlinkKeyKeySelector
- getProducedType() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SdfFlinkKeyKeySelector
- getProducedType() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.WorkItemKeySelector
- getProducer(CollectionT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
- getProducer(PValue) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Get the AppliedPTransform that produced the provided PValue.
- getProducerConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getProducerConfigUpdates() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getProducerFactoryFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getProducersMapCardinality() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getProfilingAgentConfiguration() - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
- getProgress() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
-
Returns the progress made within the restriction so far.
- getProgress() - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles.BlockTracker
- getProgress() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
- getProgress() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
- getProgress() - Method in class org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker
- getProgress() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
- getProgress() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.HasProgress
-
A representation for the amount of known completed and known remaining work.
- getProject() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- getProject() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
Project id to use when launching jobs.
- getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the project path.
- getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getProjectedSchema() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
-
Returns the projected Schema after applying column pruning.
- getProjectId() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Get the project this job exists in.
- getProjectId() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
- getProjectId() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getProjectId() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- getProjectId() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
-
Returns the project id being written to.
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SchemaPath
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- getProperties() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
- getProperties() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getProperties() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
- getProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.IcebergTableInfo
- getProperties() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getProtoBytesToRowFn(Class<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
- getProtoBytesToRowFromSchemaFunction(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
- getProtoBytesToRowFunction(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
- getProtoChangeStreamRecord() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Returns the only change stream record proto at the current pointer of the result set.
- getProvidedSparkContext() - Method in interface org.apache.beam.runners.spark.SparkContextOptions
- getProvider(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
-
Fetches a
TableProvider for this type. - getProviderRuntimeValues() - Method in interface org.apache.beam.sdk.testing.TestPipeline.TestValueProviderOptions
- getProvisionInfo(ProvisionApi.GetProvisionInfoRequest, StreamObserver<ProvisionApi.GetProvisionInfoResponse>) - Method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
- getProxyConfiguration() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
ProxyConfiguration used to configure AWS service clients. - getPTransformForInput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
- getPTransformForInput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- getPTransformForInput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.PayloadSerializerKafkaTable
- getPTransformForOutput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
- getPTransformForOutput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- getPTransformForOutput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.PayloadSerializerKafkaTable
- getPTransformId() - Method in class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
- getPublishBatchWithOrderingKey() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
- getPublished() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
-
Whether the message was published or not.
- getPublishedResultsQueue() - Method in class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- getPublishedResultsQueue() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
Returns the
Queue<Solace.PublishResult> instance associated with this session, with the asynchronously received callbacks from Solace for message publications. - getPublishLatencyMetric() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getPublishMonotonicNanos() - Method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey
- getPublishTimestampFunction() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getPubsubRootUrl() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
-
Root URL for use with the Google Cloud Pub/Sub API.
- getPViews() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Return the current views created in the pipeline.
- getQualifiers() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
- getQuantifier() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
- getQueries() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
-
Configures the BigQuery read job with the SQL query.
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- getQuery() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
- getQuery() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getQuery() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the query which can be a source for reading.
- getQuery() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getQueryLocation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
-
BigQuery geographic location where the query job will be executed.
- getQueryName() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getQueryStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the timestamp at which the change stream query for a
ChangeStreamResultSet first started. - getQueryStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time that the change stream query which produced this record started.
- getQueryString() - Method in interface org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample.DCExamplePipelineOptions
-
SQL Query.
- getQueue() - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
-
Getter for the queue.
- getQueue() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getQueueUrl() - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration
- getQuotationMark() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the character that will surround
String in staged CSV files. - getRamMegaBytes() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getRange() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
-
Returns the current range.
- getRate() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- getRate() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
- getRate() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
- getRateLimitPolicy() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
- getRateLimitPolicy() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsIO.RateLimitPolicyFactory
- getRaw(ObjectT) - Method in interface org.apache.beam.sdk.schemas.FieldValueGetter
-
Returns the raw value of the getter before any further transformations.
- getRawBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- getRawBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- getRawPrivateKey() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getRawPrivateKey() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getRawPrivateKey() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getRawStringToRowFunction(Schema) - Static method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- getRawType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
Returns the raw class type.
- getRawType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
- getRDD() - Method in class org.apache.beam.runners.spark.translation.BoundedDataset
- getRead() - Method in class org.apache.beam.io.requestresponse.Cache.Pair
- getReadCounterPrefix() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
- getReaderCacheTimeoutSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The amount of time before UnboundedReaders are considered idle and closed during streaming execution.
- getReaderCheckpoint(int, FlinkSourceReaderBase.ReaderAndOutput) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
- getReaderCheckpoint(int, FlinkSourceReaderBase.ReaderAndOutput) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSourceReader
- getReaderCheckpoint(int, FlinkSourceReaderBase.ReaderAndOutput) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
-
Create
FlinkSourceSplit for the given splitId. - getReadOperation() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- getReadQuery() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getReadResponse() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
-
Gets resources.
- getReadTime() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getReadTime() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getReadTimePercentage() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- getReason() - Method in exception class org.apache.beam.sdk.coders.CannotProvideCoderException
- getReason() - Method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
- getReasons() - Method in exception class org.apache.beam.sdk.coders.Coder.NonDeterministicException
- getReceiptHandle() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
SQS receipt handle.
- getReceiver() - Method in class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
- getReceiver() - Method in class org.apache.beam.sdk.fn.data.DataEndpoint
- getReceiver() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
- getReceiver() - Method in class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- getReceiver() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
Returns a MessageReceiver object for receiving messages from Solace.
- getReceiverBuilder() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets a
ReceiverBuilder. - getReceiverClass() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets Spark
Receiver class for a CDAP plugin. - getReceiveTimestamp() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the timestamp (in milliseconds since the Unix epoch) when the message was received by the Solace broker.
- getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
- getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
- getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
- getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
- getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
- getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
- getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
- getRecord() - Method in class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
- getRecord() - Method in class org.apache.beam.sdk.io.kudu.TableAndRecord
- getRecord() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
-
Information about the record that failed.
- getRecordId() - Method in interface org.apache.beam.sdk.values.WindowedValue
- getRecordId() - Method in class org.apache.beam.sdk.values.WindowedValues.Builder
- getRecordJfrOnGcThrashing() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
If true, save a JFR profile when GC thrashing is first detected.
- getRecordOffset() - Method in interface org.apache.beam.sdk.values.WindowedValue
- getRecordOffset() - Method in class org.apache.beam.sdk.values.WindowedValues.Builder
- getRecordReadAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the timestamp at which a record was read from the
ChangeStreamResultSet. - getRecordReadAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the record was fully read.
- getRecordSchema() - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
- getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
Indicates the order in which a record was put to the stream.
- getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
Indicates the order in which this record was put into the change stream in the scope of a partition, commit timestamp and transaction tuple.
- getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEndRecord
-
Indicates the order in which a record was put to the stream.
- getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEventRecord
-
Indicates the order in which a record was put to the stream.
- getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionStartRecord
-
Indicates the order in which a record was put to the stream.
- getRecordStreamEndedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the timestamp at which a record finished being streamed.
- getRecordStreamEndedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the record finished streaming.
- getRecordStreamStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the timestamp at which a record first started to be streamed.
- getRecordStreamStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the record started to be streamed.
- getRecordTimestamp() - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecord
-
The timestamp associated with the record.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The Cloud Spanner timestamp time when this record occurred.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
Returns the timestamp at which this partition started being valid in Cloud Spanner. - getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The timestamp at which the modifications within were committed in Cloud Spanner.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
-
Indicates the timestamp for which the change stream query has returned all changes.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEndRecord
-
Indicates the timestamp for which the change stream partition is terminated.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEventRecord
-
Returns the timestamp at which the key range change occurred.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionStartRecord
-
Returns the timestamp at which these partitions started being valid in Cloud Spanner. - getRecordType() - Method in class org.apache.beam.sdk.coders.SerializableCoder
- getRecordType() - Method in class org.apache.beam.sdk.coders.SerializableCoder
- getRedelivered() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Indicates whether the message has been redelivered due to a prior delivery failure.
- getRedistributeByRecordKey() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getRedistributeByRecordKey() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getRedistributed() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getRedistributeNumKeys() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getRedistributeNumKeys() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getReferentialConstraints() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- getRegexFromPattern(RexNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
Recursively construct a regular expression from a
RexNode. - getRegion() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Get the region this job exists in.
- getRegion() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
The Google Compute Engine region for creating Dataflow jobs.
- getRegionFromEnvironment() - Static method in class org.apache.beam.runners.dataflow.options.DefaultGcpRegionFactory
- getRegisteredOptions() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
- getReidentifyConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getReidentifyTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getReIterableGroupByKeyResult() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getRelList() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- getRelTypes() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcSplittingRule
- getRemoteHeapDumpLocation() - Method in interface org.apache.beam.sdk.options.MemoryMonitorOptions
-
A remote file system to upload captured heap dumps to.
- getRemoteInputDestinations() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
Get
RemoteInputDestinations that input data are sent to the BeamFnApi.ProcessBundleDescriptor over. - getRemoteOutputCoders() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
Get all of the transforms materialized by this
ProcessBundleDescriptors.ExecutableProcessBundleDescriptor and the Java Coder for the wire format of that transform. - getRepeatedTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
- getReplacedByJob() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Returns a new
DataflowPipelineJob for the job that replaced this one, if applicable. - getReplacedByJob() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
- getReplacementTransform(AppliedPTransform<PCollection<? extends InputT>, PCollection<OutputT>, ParDo.SingleOutput<InputT, OutputT>>) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory
- getReplacementTransform(AppliedPTransform<PCollection<? extends InputT>, PCollectionTuple, PTransform<PCollection<? extends InputT>, PCollectionTuple>>) - Method in class org.apache.beam.runners.direct.ParDoMultiOverrideFactory
- getReplacementTransform(AppliedPTransform<PCollection<ElemT>, PCollection<ElemT>, PTransform<PCollection<ElemT>, PCollection<ElemT>>>) - Method in class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView.Factory
- getReplicationGroupMessageId() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the ID for the message within its replication group (if applicable).
- getReplyTo() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getReplyTo() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the destination to which replies to this message should be sent.
- getReportCheckpointDuration() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getRequestAsString() - Method in class org.apache.beam.io.requestresponse.ApiIOError
-
The string representation of the request associated with the error.
- getRequestTimeStamp() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
Timestamp the message was received at (in epoch millis).
- getRequiredSchema() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
-
Returns a Schema that includes all the fields required for a successful read.
- getRequirements() - Method in class org.apache.beam.sdk.transforms.Contextful
-
Returns the requirements needed to run the closure.
- getResidual() - Method in class org.apache.beam.sdk.transforms.splittabledofn.SplitResult
-
Returns the residual restriction.
- getResourceHints() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Returns resource hints set on the transform.
- getResourceHints() - Method in interface org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions
- getResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
-
Gets resources.
- getResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
-
Gets resources.
- getResourceType() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- getResponse() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
-
HTTP response from the FHIR store after attempting to write the Bundle.
- getResponseItemJson() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
- getResponses() - Method in class org.apache.beam.io.requestresponse.Result
- getRestrictionCoder() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- getRestrictionCoder() - Method in class org.apache.beam.sdk.io.pulsar.NaiveReadFromPulsarDoFn
- getRestrictionCoder() - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
- getResult() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
-
Returns the result of the transaction execution.
- getResult() - Method in class org.apache.beam.sdk.metrics.BoundedTrieResult
- getResultCoder(Pipeline) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Provide the result coder.
- getResultCount() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getResults() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
-
Returns a
PCollection containing the Rows that have been parsed. - getRetainDockerContainers() - Method in interface org.apache.beam.sdk.options.ManualDockerEnvironmentOptions
- getRetainExternalizedCheckpointsOnCancellation() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getRetryableCodes() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getReturnType(RelDataTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
- getReturnType(RelDataTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
- getReturnType(RelDataTypeFactory, SqlOperatorBinding) - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
- getRole() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getRole() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getRole() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getRoot() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- getRootCause() - Method in exception class org.apache.beam.sdk.coders.CannotProvideCoderException
-
Returns the inner-most
CannotProvideCoderException when they are deeply nested. - getRootCheckpointDir() - Method in class org.apache.beam.runners.spark.translation.streaming.Checkpoint.CheckpointDir
- getRootElement() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration
- getRootSchema() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getRootTransforms() - Method in interface org.apache.beam.runners.direct.ExecutableGraph
- getRoutingKey() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- getRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
- getRow(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Row value by field index; IllegalStateException is thrown if the schema doesn't match. - getRow(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.ROW value by field name; IllegalStateException is thrown if the schema doesn't match. - getRowCount() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
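The getRow entries above retrieve a nested Row value either by field index or by field name, with IllegalStateException thrown on a schema mismatch. A minimal sketch of both accessors (the "address"/"city" schema is illustrative, not taken from the index):

```java
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.values.Row;

// Outer schema with a nested ROW field named "address".
Schema inner = Schema.builder().addStringField("city").build();
Schema outer = Schema.builder().addRowField("address", inner).build();

Row address = Row.withSchema(inner).addValue("Seattle").build();
Row person = Row.withSchema(outer).addValue(address).build();

// Both accessors return the same nested Row; calling getRow on a
// non-ROW field would throw IllegalStateException.
Row byIndex = person.getRow(0);
Row byName = person.getRow("address");
```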
- getRowCount() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
- getRowGroupSize() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration
- getRowReceiver(TupleTag<T>) - Method in interface org.apache.beam.sdk.transforms.DoFn.MultiOutputReceiver
-
Returns a
DoFn.OutputReceiver for publishing Row objects to the given tag. - getRowRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- getRows() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- getRows() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.TableWithRows
- getRowSchema() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- getRowSelector(Schema, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
- getRowSelectorOptimized(Schema, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
- getRowsWritten() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
-
The number of rows written in this batch.
- getRowToAvroBytesFunction(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Returns a function mapping Beam
Rows to encoded AVRO GenericRecords. - getRowToBytesFn(String) - Static method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider
- getRowToGenericRecordFunction(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Returns a function mapping Beam
Rows to AVRO GenericRecords for use in PCollection.setSchema(org.apache.beam.sdk.schemas.Schema, org.apache.beam.sdk.values.TypeDescriptor<T>, org.apache.beam.sdk.transforms.SerializableFunction<T, org.apache.beam.sdk.values.Row>, org.apache.beam.sdk.transforms.SerializableFunction<org.apache.beam.sdk.values.Row, T>). - getRowToJsonBytesFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
-
Returns a
SimpleFunction mapping Beam Rows to JSON byte[] arrays. - getRowToJsonStringsFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
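A minimal sketch of the JsonUtils row-to-JSON conversion above (the schema and values are illustrative assumptions, not from the source):

```java
import java.nio.charset.StandardCharsets;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.schemas.utils.JsonUtils;
import org.apache.beam.sdk.transforms.SimpleFunction;
import org.apache.beam.sdk.values.Row;

public class RowToJsonExample {
    public static void main(String[] args) {
        // Hypothetical two-field schema.
        Schema schema = Schema.builder()
            .addStringField("name")
            .addInt64Field("count")
            .build();
        Row row = Row.withSchema(schema).addValues("beam", 42L).build();

        // The returned SimpleFunction serializes each Row to UTF-8 JSON bytes.
        SimpleFunction<Row, byte[]> toJson = JsonUtils.getRowToJsonBytesFunction(schema);
        String json = new String(toJson.apply(row), StandardCharsets.UTF_8);
        System.out.println(json);
    }
}
```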
- getRowToProtoBytes(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
- getRowToProtoBytesFn(Class<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
- getRowToProtoBytesFromSchema(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
- getRowToRawBytesFunction(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- getRowToRawBytesFunction(String) - Static method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- getRowType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The type of the primary keys and modified columns within this record.
- getRowType(RelDataTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
- getRpcPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getRule() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifierListContext
- getRuleNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- getRuleNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- getRuleSets() - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRuleSets
- getRunner() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
The pipeline runner that will be used to execute the pipeline.
- getRunningAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which the connector started processing this partition.
- getS3ClientBuilder() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Builder used to create the
S3Client. - getS3ClientFactoryClass() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getS3StorageClass() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getS3StorageClass() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
The AWS S3 storage class used for creating S3 objects.
- getS3ThreadPoolSize() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getS3ThreadPoolSize() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Thread pool size, limiting the max concurrent S3 operations.
- getS3UploadBufferSizeBytes() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getS3UploadBufferSizeBytes() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Size of S3 upload chunks.
- getSafeFilepattern() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
- getSafeSchema() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
- getSamplePeriod() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The length of time sampled request data will be retained.
- getSamplePeriodBucketSize() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The size of buckets within the specified
samplePeriod. - getSamplingStrategy() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
- getSasToken() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getSaveHeapDumpsToGcsPath() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
CAUTION: This option implies dumpHeapOnOOM, and has similar caveats.
- getSavepointPath() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getSaveProfilesToGcs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
- getSbeFields() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
- getScale() - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
- getScan() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
- getScanType() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getScheduledAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which this partition was scheduled to be queried.
- getScheduledExecutorService() - Method in interface org.apache.beam.sdk.options.ExecutorOptions
-
The
ScheduledExecutorService instance to use to create threads; can be overridden to specify a ScheduledExecutorService that is compatible with the user's environment. - getSchema() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns the schema used by this coder.
- getSchema() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
-
Deprecated.
- getSchema() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getSchema() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Get the schema info of the table.
- getSchema() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
- getSchema() - Method in class org.apache.beam.sdk.extensions.sql.meta.SchemaBaseBeamTable
- getSchema() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
- getSchema() - Static method in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
- getSchema() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
-
The schema used by sources to deserialize data and create Beam Rows.
- getSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
- getSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getSchema() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.IcebergTableInfo
- getSchema() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getSchema() - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
-
Schema for the destination, in the event that it must be dynamically created.
- getSchema() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getSchema() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getSchema() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getSchema() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the schema of a Snowflake table.
- getSchema() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getSchema() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getSchema() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
-
Returns the schema associated with this type.
- getSchema() - Method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
- getSchema() - Method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.TypeDescriptorWithSchema
- getSchema() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
- getSchema() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Returns the schema used by this
CoGbkResult. - getSchema() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the attached schema.
- getSchema() - Method in class org.apache.beam.sdk.values.Row.Builder
-
Return the schema for the row being built.
- getSchema() - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
- getSchema() - Method in class org.apache.beam.sdk.values.Row
-
Return
Schema which describes the fields. - getSchema(TableReference, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
- getSchema(DestinationT) - Method in class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
-
Return an AVRO schema for a given destination.
- getSchema(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns the table schema for the destination.
- getSchema(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- getSchema(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
- getSchema(Class<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
- getSchema(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return a Beam
Schema from the Pub/Sub schema resource, if it exists. - getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Return a Beam
Schema from the Pub/Sub schema resource, if it exists. - getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
Return a Beam
Schema from the Pub/Sub schema resource, if it exists. - getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- getSchema(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve a
Schema for a given TypeDescriptor type. - getSchemaCoder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageSchemaCoder
- getSchemaCoder(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve a
SchemaCoder for a given Class type. - getSchemaCoder(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve a
SchemaCoder for a given TypeDescriptor type. - getSchemaId() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
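The SchemaRegistry lookups above can be sketched as follows (the Event POJO is hypothetical; getSchemaCoder throws NoSuchSchemaException when no schema can be found for the type):

```java
import org.apache.beam.sdk.schemas.JavaFieldSchema;
import org.apache.beam.sdk.schemas.NoSuchSchemaException;
import org.apache.beam.sdk.schemas.SchemaCoder;
import org.apache.beam.sdk.schemas.SchemaRegistry;
import org.apache.beam.sdk.schemas.annotations.DefaultSchema;

public class SchemaRegistryExample {
    // Hypothetical POJO whose schema is inferred from its public fields.
    @DefaultSchema(JavaFieldSchema.class)
    public static class Event {
        public String id;
        public long timestamp;
    }

    public static void main(String[] args) throws NoSuchSchemaException {
        SchemaRegistry registry = SchemaRegistry.createDefault();
        // Look up a SchemaCoder by Class; the TypeDescriptor overload
        // works the same way for generic types.
        SchemaCoder<Event> coder = registry.getSchemaCoder(Event.class);
        System.out.println(coder.getSchema().getFieldCount());
    }
}
```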
- getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.avro.AvroTableProvider
- getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
- getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubTableProvider
- getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
- getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return
PubsubClient.SchemaPath from PubsubClient.TopicPath, if it exists. - getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Return
PubsubClient.SchemaPath from PubsubClient.TopicPath, if it exists. - getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
Return
PubsubClient.SchemaPath from PubsubClient.TopicPath, if it exists. - getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- getSchemaProvider(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve a registered
SchemaProvider for a given Class. - getSchemaProvider(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve a registered
SchemaProvider for a given TypeDescriptor. - getSchemaProviders() - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaRegistrar
- getSchemaProviders() - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProviderRegistrar
- getSchemaProviders() - Method in interface org.apache.beam.sdk.schemas.SchemaProviderRegistrar
-
Returns a list of
schema providers which will be registered by default within each schema registry instance. - getSchemaRegistry() - Method in class org.apache.beam.sdk.Pipeline
- getSchematizedData() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets schematized data.
- getSchemaWithoutAttributes(Schema, List<String>) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- getScheme() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
- getScheme() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
The uri scheme used by resources on this filesystem.
- getScheme() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
- getScheme() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
- getScheme() - Method in class org.apache.beam.sdk.io.FileSystem
-
Get the URI scheme which defines the namespace of the
FileSystem. - getScheme() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
-
Get the scheme which defines the namespace of the
ResourceId. - getSdkComponents() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
- getSdkContainerImage() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Container image used to configure SDK execution environment on worker.
- getSdkHarnessContainerImageOverrides() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Overrides for SDK harness container images.
- getSdkHarnessLogLevelOverrides() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
This option controls the log levels for specifically named loggers.
- getSdkWorkerId() - Method in interface org.apache.beam.sdk.fn.server.HeaderAccessor
-
This method should be called from the request method.
- getSdkWorkerParallelism() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getSearchEndPoint() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getSeconds() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate
- getSelectedFields() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- getSemiPersistDir() - Method in interface org.apache.beam.sdk.options.RemoteEnvironmentOptions
- getSempClientFactory() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getSenderTimestamp() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the timestamp (in milliseconds since the Unix epoch) when the message was sent by the sender.
- getSendFacility() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets send facility.
- getSendTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets send time.
- getSequenceNumber() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getSequenceNumber() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
-
Deprecated.
- getSequenceNumber() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getSequenceNumber() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the sequence number of the message (if applicable).
- getSerializableFunctionUdfs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
-
For UDFs implement
SerializableFunction. - getSerializableOptions() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getSerializableOptions() - Method in class org.apache.beam.runners.spark.translation.SparkTranslationContext
- getSerializedATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- getSerializedATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- getSerializedKey() - Method in class org.apache.beam.runners.flink.adapter.FlinkKey
- getSerializedWindowingStrategy() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration
- getSerializer(String, Schema, Map<String, Object>) - Static method in class org.apache.beam.sdk.schemas.io.payloads.PayloadSerializers
- getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.io.payloads.AvroPayloadSerializerProvider
- getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoPayloadSerializerProvider
- getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.io.thrift.ThriftPayloadSerializerProvider
- getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.schemas.io.payloads.JsonPayloadSerializerProvider
- getSerializer(Schema, Map<String, Object>) - Method in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializerProvider
-
Get a PayloadSerializer.
- getServer() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Get the underlying
Server contained by this GrpcFnServer. - getServerFactory() - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.Provider
- getServerFactory() - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.Provider
- getServerFactory() - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory.Provider
-
Create the
ServerFactory applicable to this environment. - getServerFactory() - Method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.Provider
- getServerName() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getServerName() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getServerName() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getServerTransactionId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The unique transaction id in which the modifications occurred.
- getService() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Get the service exposed by this
GrpcFnServer. - getServiceAccount() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Run the job as a specific service account, instead of the default GCE robot.
- getServiceURL(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getServiceURL(String, String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getSessionProperties() - Method in class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- getSessionProperties() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
Override this method to provide your specific properties, including those related to authentication and any others required.
- getSessionServiceFactory() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getSetFieldCreator(TypeDescriptor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- getSetters(TypeDescriptor<?>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
Return the list of
FieldValueSetters for a Java Bean class. - getSetters(TypeDescriptor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- getSha256() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
-
The SHA-256 hash of the source file.
- getShard() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
- getShardId() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getShardingFunction() - Method in class org.apache.beam.sdk.io.WriteFiles
- getShardNameTemplate() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
-
See
ShardNameTemplate for the expected values. - getShardNumber() - Method in class org.apache.beam.sdk.values.ShardedKey
-
Deprecated.
- getShardTemplate() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getSharedKeySize() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefix
- getShortTableUrn() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
Return shortened tablespec in datasets/[dataset]/tables/[table] format.
- getShortValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
Return the optional short value for an item, or null if none is provided.
- getShortValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The optional short value for an item, or
null if none is provided. - getShutdownSourcesAfterIdleMs() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getSideInput(String) - Method in interface org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory.SideInputGetter
- getSideInputBroadcast(PCollection<T>, SideInputValues.Loader<T>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
- getSideInputBroadcast(PCollection<T>, SideInputValues.Loader<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getSideInputDataSets() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getSideInputKeys() - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
-
Get the tag IDs of all the keys.
- getSideInputs() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
- getSideInputs() - Method in class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
- getSideInputs() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Override to specify that this object needs access to one or more side inputs.
- getSideInputs() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Specifies that this object needs access to one or more side inputs.
- getSideInputs() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns the side inputs used by this Combine operation.
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
Returns the side inputs used by this Combine operation.
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
-
Returns the side inputs used by this Combine operation.
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Requirements
-
The side inputs that this
Contextful needs access to. - getSideInputs(Iterable<PCollectionView<?>>, JavaSparkContext, SparkPCollectionView) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Create SideInputs as Broadcast variables.
- getSideInputs(ExecutableStage) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
- getSideInputSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
Get a mapping from PTransform id to side input id to
side inputs that are used during execution. - getSideInputWindow(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
-
Returns the window of the side input corresponding to the given window of the main input.
- getSingleFileMetadata() - Method in class org.apache.beam.sdk.io.FileBasedSource
-
Returns the information about the single file that this source is reading from.
- getSinglePCollection() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Like
PCollectionRowTuple.get(String), but is a convenience method to get a single PCollection without providing a tag for that output. - getSingleTokenNewPartition(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
-
Return a new NewPartition that only contains one token that matches the parentPartition.
- getSingleWorkerStatus(String, long, TimeUnit) - Method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
-
Get the latest SDK worker status from the client's corresponding SDK harness.
- getSink() - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool
-
Sink for control clients.
- getSink() - Method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
- getSink() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
Returns the FileBasedSink for this write operation.
- getSink() - Method in class org.apache.beam.sdk.io.WriteFiles
- getSinkGroupId() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getSinks() - Static method in class org.apache.beam.sdk.metrics.Lineage
-
Lineage representing sinks. - getSize() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
-
Get the size.
- getSize() - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
-
Size of the generated windows.
- getSize() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
- getSize() - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
- getSize() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
- getSize() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
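The getSize() accessors on the windowing strategies above can be sketched briefly (the durations chosen are illustrative):

```java
import org.joda.time.Duration;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.SlidingWindows;

public class WindowSizeExample {
    public static void main(String[] args) {
        // Hypothetical strategies: 10-minute fixed windows, and 10-minute
        // sliding windows that advance every minute.
        FixedWindows fixed = FixedWindows.of(Duration.standardMinutes(10));
        SlidingWindows sliding = SlidingWindows.of(Duration.standardMinutes(10))
            .every(Duration.standardMinutes(1));

        // getSize() reports the configured window length for both strategies.
        System.out.println(fixed.getSize().getStandardMinutes());
        System.out.println(sliding.getSize().getStandardMinutes());
    }
}
```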
- getSize(StreamProgress) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
- getSize(PartitionMetadata, TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
- getSize(TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
- getSize(PulsarSourceDescriptor, OffsetRange) - Method in class org.apache.beam.sdk.io.pulsar.NaiveReadFromPulsarDoFn
- getSketchFromByteBuffer(ByteBuffer) - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount
-
Converts the passed-in sketch from
ByteBuffer to byte[], mapping null ByteBuffers (representing empty sketches) to empty byte[]s. - getSkipHeaderLines() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
- getSkipKeyClone() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getSkipValueClone() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getSnapshot() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getSnapshotId() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getSnapshots() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteResult
- getSnowPipe() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getSocketTimeout() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getSorterType() - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
-
Returns the sorter type.
- getSource() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
-
The file to stage.
- getSource() - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputSplit
- getSource() - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool
-
Source of control clients.
- getSource() - Method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
- getSource() - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
- getSource() - Method in class org.apache.beam.sdk.io.Read.Bounded
-
Returns the
BoundedSource used to create this Read PTransform. - getSource() - Method in class org.apache.beam.sdk.io.Read.Unbounded
-
Returns the
UnboundedSource used to create this Read PTransform. - getSource() - Method in class org.apache.beam.sdk.io.TextIO.Read
- getSource() - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
- getSources() - Static method in class org.apache.beam.sdk.metrics.Lineage
-
Lineage representing sources and optionally side inputs. - getSparkCheckpointDir() - Method in class org.apache.beam.runners.spark.translation.streaming.Checkpoint.CheckpointDir
- getSparkContext() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getSparkContext() - Method in class org.apache.beam.runners.spark.translation.SparkTranslationContext
- getSparkContext(SparkPipelineOptions) - Static method in class org.apache.beam.runners.spark.translation.SparkContextFactory
- getSparkMaster() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- getSparkPCollectionViewType() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
- getSparkReceiverClass() - Method in class org.apache.beam.sdk.io.sparkreceiver.ReceiverBuilder
- getSparkSession() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext
- getSparkSession() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
- getSparkSession() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getSplit() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.SerializableSplit
- getSplitBacklogBytes() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getSplitBacklogBytes() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns the size of the backlog of unread data in the underlying data source represented by this split of this source.
- getSplitNumber() - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputSplit
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
Returns the total amount of parallelism in the consumed (returned and processed) range of this reader's current
BoundedSource (as would be returned by BoundedSource.BoundedReader.getCurrentSource()).
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
- getSplitPointsProcessed() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
Returns the total number of split points that have been processed.
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
Returns the total amount of parallelism in the unprocessed part of this reader's current
BoundedSource (as would be returned by BoundedSource.BoundedReader.getCurrentSource()).
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
- getSplitSerializer() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- getSplitSources() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
-
Visible so that we can check this in tests.
- getSplitState() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplit
- getSSEAlgorithm() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getSSEAlgorithm() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Algorithm for SSE-S3 encryption, e.g.
- getSSECustomerKey() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getSSECustomerKey() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
SSE key for SSE-C encryption, e.g.
- getSSEKMSKeyId() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getSSEKMSKeyId() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
KMS key id for SSE-KMS encryption, e.g.
- getSsl() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getStableUniqueNames() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
Whether to check for stable unique names on each transform.
- getStackTrace() - Method in class org.apache.beam.io.requestresponse.ApiIOError
-
The
Exception stack trace.
- getStackTrace() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
-
The caught
Throwable.getStackTrace().
- getStackTrace() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getStageBundleFactory(ExecutableStage) - Method in class org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext
- getStageBundleFactory(ExecutableStage) - Method in interface org.apache.beam.runners.fnexecution.control.ExecutableStageContext
- getStagedArtifacts(String) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
-
Returns the rewritten artifacts associated with this job, keyed by environment.
- getStageName() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getStager() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The resource stager instance that should be used to stage resources.
- getStagerClass() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The class responsible for staging resources to be accessible by workers during job execution.
- getStagingBucketDir() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the directory where files are staged.
- getStagingBucketDir() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceConfig
-
Gets the bucket name and directory where files were staged and are waiting to be loaded.
- getStagingBucketName() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getStagingBucketName() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getStagingLocation() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
GCS path for staging local files, e.g.
- getStart() - Method in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
- getStart() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
- getStart() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
- getStart() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
- getStartAtTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
- getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- getStartingStrategy() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getStartKey() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns the
ByteKey representing the lower bound of this ByteKeyRange.
- getStartOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns the starting offset of the source.
- getStartPosition() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
- getStartPosition() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
- getStartPosition() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
-
Returns the starting position of the current range, inclusive.
- getStartReadTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getStartTime() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
-
Returns the time the reader was started.
- getStartTime() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
- getStartTime() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
It is the partition_start_time of the child partition token.
- getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
It is the start time at which the partition started existing in Cloud Spanner.
- getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionStartRecord
-
It is the partition start time of the partition tokens.
- getState() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
- getState() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
- getState() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
- getState() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
- getState() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
- getState() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
- getState() - Method in class org.apache.beam.runners.jet.JetPipelineResult
- getState() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Retrieve the job's current state.
- getState() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
- getState() - Method in class org.apache.beam.runners.spark.stateful.SparkStateInternals
- getState() - Method in class org.apache.beam.runners.spark.stateful.StateAndTimers
- getState() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult
- getState() - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
- getState() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The state the current partition is in.
- getState() - Method in interface org.apache.beam.sdk.PipelineResult
-
Retrieves the current state of the pipeline execution.
- getState() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimator
-
Get current state of the
WatermarkEstimator instance, which can be used to recreate the WatermarkEstimator when processing the restriction.
- getState() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.Manual
- getState() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.MonotonicallyIncreasing
- getState() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.WallTime
- getState(JobApi.GetJobStateRequest, StreamObserver<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- getState(JobApi.GetJobStateRequest, StreamObserver<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.portability.testing.TestJobService
- getStateBackend() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getStateBackendFactory() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
Deprecated. Please use setStateBackend below.
- getStateBackendStoragePath() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getStateCoder() - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
Used to encode the state of this
Watch.Growth.TerminationCondition.
- getStateCoder(Pipeline) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Provide the state coder.
- getStateEvent() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Retrieve the job's current state.
- getStateStream(JobApi.GetJobStateRequest, StreamObserver<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- getStaticCreator(TypeDescriptor<?>, Method, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- getStaticCreator(TypeDescriptor<?>, Method, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- getStatistic() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
- getStatistics(BaseStatistics) - Method in class org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat
- getStatistics(BaseStatistics) - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat
- getStatus() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- getStatusCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getStatusDate() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getStatusUpdateFrequency() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Determines the frequency of emission of the
OrderedProcessingStatus elements.
- getStepName() - Method in class org.apache.beam.runners.spark.translation.SparkProcessContext
- getStepNames() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
-
Returns the mapping of
AppliedPTransforms to the internal step name for that AppliedPTransform.
- getStopPipelineWatermark() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
- getStopPosition() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
- getStopPosition() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
- getStopPosition() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
-
Returns the ending position of the current range, exclusive.
- getStopReadTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getStorageApiAppendThresholdBytes() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageApiAppendThresholdRecordCount() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageClient(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
-
Returns a real, mock, or fake
BigQueryServices.StorageClient.
- getStorageClient(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
- getStorageClient(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- getStorageIntegrationName() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getStorageIntegrationName() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the Snowflake integration that is used in the COPY statement.
- getStorageIntegrationName() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getStorageLevel() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- getStorageWriteApiMaxRequestSize() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageWriteApiMaxRetries() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageWriteApiTriggeringFrequencySec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageWriteMaxInflightBytes() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageWriteMaxInflightRequests() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStreamAppendClient(String, DescriptorProtos.DescriptorProto, boolean, AppendRowsRequest.MissingValueInterpretation) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
-
Create an append client for a given Storage API write stream.
- getStreamAppendClient(String, DescriptorProtos.DescriptorProto, boolean, AppendRowsRequest.MissingValueInterpretation) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- getStreamAppendClient(String, DescriptorProtos.DescriptorProto, boolean, AppendRowsRequest.MissingValueInterpretation) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getStreaming() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getStreamingContext() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getStreamingContext() - Method in class org.apache.beam.runners.spark.translation.SparkStreamingTranslationContext
- getStreamingService() - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices
- getStreamingService() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeServicesImpl
- getStreamingSideInputCacheExpirationMillis() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getStreamingSideInputCacheMb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getStreamingTimeoutMs() - Method in interface org.apache.beam.runners.spark.SparkPortableStreamingPipelineOptions
- getStreamName() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getStreamSources() - Method in class org.apache.beam.runners.spark.translation.streaming.UnboundedDataset
- getStreamTableDebugString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- getString() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getString(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a
String value by field index, ClassCastException is thrown if schema doesn't match.
- getString(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.STRING value by field name, IllegalStateException is thrown if schema doesn't match.
- getString(Map&lt;String, Object&gt;, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getString(Map<String, Object>, String, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getStrings(Map<String, Object>, String, List<String>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getStringSet() - Method in class org.apache.beam.sdk.metrics.StringSetResult.EmptyStringSetResult
-
Returns an empty immutable set.
- getStringSet() - Method in class org.apache.beam.sdk.metrics.StringSetResult
- getStringSet(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- getStringSet(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the
StringSet that should be used for implementing the given metricName in this container.
- getStringSets() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the sets that matched the filter.
- getStuckCommitDurationMillis() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getSubmissionMode() - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
- getSubmissionMode() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getSubnetwork() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
GCE subnetwork for launching workers.
- getSubProvider(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.FullNameTableProvider
- getSubProvider(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Returns a sub-provider, e.g.
- getSubProviders() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Returns all sub-providers, e.g.
- getSubSchema(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
-
If this is the root schema (in other words, a CatalogManager), the sub schema will be a Catalog's metastore.
- getSubSchema(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogManagerSchema
- getSubSchema(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogSchema
- getSubSchemaNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getSubSchemaNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogManagerSchema
- getSubSchemaNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogSchema
- getSubscription() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getSubscription() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the subscription being read from.
- getSubscriptionName() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getSubscriptionProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the
ValueProvider for the subscription being read from.
- getSubSequenceNumber() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getSubtype(Class<? extends T>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns the generic form of a subtype.
- getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
-
Gets successful bodies from Write.
- getSuccessfulBundles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
Gets successful FhirBundleResponse from execute bundles operation.
- getSuccessfulInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Returns a
PCollection containing the TableRows that were written to BQ via the streaming insert API.
- getSuccessfulPublish() - Method in class org.apache.beam.sdk.io.solace.write.SolaceOutput
- getSuccessfulStorageApiInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Return all rows successfully inserted using one of the storage-api insert methods.
- getSuccessfulTableLoads() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Returns a
PCollection containing the TableDestinations that were successfully loaded using the batch load API.
- getSuggestedFilenameSuffix() - Method in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- getSuggestedFilenameSuffix() - Method in interface org.apache.beam.sdk.io.FileBasedSink.OutputFileHints
- getSuggestedSuffix() - Method in enum class org.apache.beam.sdk.io.Compression
- getSum() - Method in class org.apache.beam.sdk.metrics.DistributionResult
- getSum() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
- getSumAndReset() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
- getSummary() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getSupertype(Class<? super T>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns the generic form of a supertype.
- getSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
- getSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergFilter
- getSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableFilter
- getSupportedClass() - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
-
Gets the class this
CloudObjectTranslator is capable of converting.
- getSupportedClass() - Method in class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
- getSupportedClass() - Method in class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
- getSynchronizedProcessingOutputWatermark() - Method in interface org.apache.beam.runners.local.Bundle
-
Returns the processing time output watermark at the time the producing
Executable committed this bundle.
- getSynchronizedProcessingTime() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
- getTable() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Returns the table to read, or
null if reading from a query instead.
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Returns the table reference, or
null.
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- getTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getTable() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getTable() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getTable() - Method in class org.apache.beam.sdk.io.kudu.TableAndRecord
- getTable() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
- getTable() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
- getTable() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getTable() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the table used as a source for reading or as a destination for writing.
- getTable() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getTable(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Gets the specified
Table resource by table ID.
- getTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
-
Gets the specified
Table resource by table ID.
- getTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getTable(TableReference, List<String>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
- getTable(TableReference, List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
- getTable(TableReference, List<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
- getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
- getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getTable(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns a
TableDestination object for the destination.
- getTable(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- getTable(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getTable(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogManagerSchema
- getTable(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogSchema
- getTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- getTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergMetastore
- getTable(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Get a specific table from this provider if it is present, or null if it is not present.
- getTable(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- getTable(BigQueryOptions, TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
- getTableAdminClient() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
- getTableByFullName(TableName) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- getTableByFullName(TableName) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.FullNameTableProvider
- getTableConstraints(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns TableConstraints (including primary and foreign key) to be used when creating the table.
- getTableConstraints(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- getTableConstraints(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- getTableCreateConfig() - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination
-
Metadata and constraints for creating a new table, if it must be done dynamically.
- getTableDescription() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns the table being read from.
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
-
Return the metadata table name.
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
- getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
- getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
- getTableIdentifier() - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination
-
The iceberg table identifier to write data to.
- getTableIdentifier() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getTableIdentifier() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getTableIdentifierString() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getTableImpl(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getTableName() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Table name, the last element of the fully-specified table name with path.
- getTableName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
- getTableName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The name of the table in which the modifications within this record occurred.
- getTableNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getTableNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogManagerSchema
- getTableNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogSchema
- getTableProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
- getTableProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getTableProvider() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Returns the table to read, or
nullif reading from a query instead. - getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- getTableReference() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getTableResource(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- getTableRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergMetastore
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
- getTables() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Get all tables from this provider.
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
- getTables() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
- getTableSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
- getTableSchema() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
- getTableSchema() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
- getTableSchema(String, String) - Static method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
-
Returns TableSchema for a given table.
- getTableSchema(String, String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogBeamSchema
-
Gets the table schema, or absent optional if the table doesn't exist in the database.
- getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
-
Specifies a table for a BigQuery read job.
- getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
Return the tablespec in [project:].dataset.tableid format.
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
- getTableStatistics(PipelineOptions) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Estimates the number of rows or the rate for unbounded Tables.
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTable
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
- getTableStatistics(PipelineOptions, SchemaIO) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
- getTableStatistics(PipelineOptions, SchemaIO) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
- getTableStringIdentifier(ValueInSingleWindow<Row>) - Method in interface org.apache.beam.sdk.io.iceberg.DynamicDestinations
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.avro.AvroTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergMetastore
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.KafkaTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.parquet.ParquetTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite.PubsubLiteTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.seqgen.GenerateSequenceTableProvider
- getTableType() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Gets the table type this provider handles.
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
- getTableUrn(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
Return the tablespec in projects/[project]/datasets/[dataset]/tables/[table] format.
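The two TableDestination entries above describe the same table in two forms: the tablespec ("[project:].dataset.tableid") and the resource path ("projects/[project]/datasets/[dataset]/tables/[table]"). A minimal stdlib-only sketch of converting one form to the other; the helper name toTableUrn and the assumption that the project part is present are illustrative, not Beam API:

```java
// Illustration only: converts a "project:dataset.table" spec into the
// "projects/{p}/datasets/{d}/tables/{t}" resource form described above.
// Assumes the optional project prefix is present; not Beam's implementation.
public class TableSpecDemo {
    static String toTableUrn(String tableSpec) {
        // Split "project:dataset.table" on the first ':' and the last '.'.
        int colon = tableSpec.indexOf(':');
        String project = tableSpec.substring(0, colon);
        String rest = tableSpec.substring(colon + 1);
        int dot = rest.lastIndexOf('.');
        String dataset = rest.substring(0, dot);
        String table = rest.substring(dot + 1);
        return String.format("projects/%s/datasets/%s/tables/%s", project, dataset, table);
    }

    public static void main(String[] args) {
        System.out.println(toTableUrn("my-project:my_dataset.my_table"));
        // prints "projects/my-project/datasets/my_dataset/tables/my_table"
    }
}
```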
- getTag() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getTag() - Method in class org.apache.beam.sdk.values.TaggedPValue
-
Returns the local tag associated with the PValue.
- getTag(int) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
Returns the tuple tag at the given index.
- getTagInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
Deprecated. This method will be removed entirely. The PCollection underlying a side input is part of the side input's specification with a ParDo transform, which will obtain that information via a package-private channel.
- getTagInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
-
Returns a unique TupleTag identifying this PCollectionView.
- getTargetDataset() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
- getTargetParallelism() - Method in interface org.apache.beam.runners.direct.DirectOptions
- getTargetTable(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- getTargetTableId(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- getTempDatasetId() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getTempDirectory() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
- getTempDirectoryProvider() - Method in class org.apache.beam.sdk.io.FileBasedSink
-
Returns the directory inside which temporary files will be written according to the configured FileBasedSink.FilenamePolicy.
- getTempFilename() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
- getTemplateLocation() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Where the runner should generate a template file.
- getTempLocation() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
-
Returns the configured temporary location.
- getTempLocation() - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
-
Returns the configured temporary location.
- getTempLocation() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
A pipeline level default location for storing temporary files.
- getTempRoot() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
- GETTER_WITH_NULL_METHOD_ERROR - Static variable in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- GetterBasedSchemaProvider - Class in org.apache.beam.sdk.schemas
-
Deprecated. New implementations should extend the GetterBasedSchemaProviderV2 class's methods, which receive TypeDescriptors instead of ordinary Classes as arguments; this permits supporting generic type signatures during schema inference.
-
Deprecated.
- GetterBasedSchemaProviderBenchmark - Class in org.apache.beam.sdk.jmh.schemas
-
Benchmarks for GetterBasedSchemaProvider on reading / writing fields based on toRowFunction / fromRowFunction.
- GetterBasedSchemaProviderBenchmark() - Constructor for class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
- GetterBasedSchemaProviderV2 - Class in org.apache.beam.sdk.schemas
-
A newer version of GetterBasedSchemaProvider, which works with TypeDescriptors and which by default delegates the old, Class-based methods to the new ones.
- GetterBasedSchemaProviderV2() - Constructor for class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
- getTerminateAfterSecondsSinceNewOutput() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
-
If no new files are found after this many seconds, this transform will cease to watch for new files.
- GetterTypeSupplier() - Constructor for class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
- getTestMode() - Method in interface org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineOptions
-
Set to true to run the job in test mode.
- getTestTimeoutSeconds() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
- getThrottleDuration() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The amount of time an attempt will be throttled if deemed necessary based on previous success rate.
- getThroughputEstimate() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- getTimeDomain() - Method in interface org.apache.beam.sdk.state.TimerSpec
- getTimePartitioning() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getTimerDataIterator() - Method in class org.apache.beam.runners.spark.translation.SparkProcessContext
- getTimerFamilyId() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
- getTimerFamilyId() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
- getTimerReceivers() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
-
Get a map of (transform id, timer id) to receivers which consume timers, forwarding them to the remote environment.
- getTimerReceivers() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
- getTimers() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- getTimers() - Method in class org.apache.beam.runners.spark.stateful.StateAndTimers
- getTimers(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
- getTimers(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
- getTimerSpec() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
- getTimerSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
Get a mapping from PTransform id to timer id to timer specs that are used during execution.
- getTimers() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Get times so they can be pushed into the GlobalWatermarkHolder.
- getTimestamp() - Method in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
- getTimestamp() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
- getTimestamp() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- getTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
-
Indicates the timestamp for which the change stream query has returned all changes.
- getTimestamp() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getTimestamp() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getTimestamp() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getTimestamp() - Method in class org.apache.beam.sdk.metrics.GaugeResult.EmptyGaugeResult
- getTimestamp() - Method in class org.apache.beam.sdk.metrics.GaugeResult
- getTimestamp() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the timestamp of this FailsafeValueInSingleWindow.
- getTimestamp() - Method in class org.apache.beam.sdk.values.TimestampedValue
- getTimestamp() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Returns the timestamp of this ValueInSingleWindow.
- getTimestamp() - Method in interface org.apache.beam.sdk.values.WindowedValue
-
The timestamp of this value in event time.
- getTimestamp() - Method in class org.apache.beam.sdk.values.WindowedValues.Builder
- getTimestamp() - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- getTimestamp(T, Instant) - Method in interface org.apache.beam.sdk.io.kafka.KafkaPublishTimestampFunction
-
Returns timestamp for element being published to Kafka.
- getTimeStamp() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
Timestamp the message was sent at (in epoch millis).
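The getTimeStamp() entry above returns epoch millis. A stdlib-only sketch of interpreting such a value with java.time; the sample value is invented for illustration:

```java
// Sketch: turning an epoch-millis timestamp like the one the SQS entry
// above describes into a readable instant, using only java.time.
import java.time.Instant;

public class EpochMillisDemo {
    public static void main(String[] args) {
        long sentAtMillis = 1_700_000_000_000L; // invented sample value
        Instant sentAt = Instant.ofEpochMilli(sentAtMillis);
        System.out.println(sentAt); // prints "2023-11-14T22:13:20Z"
    }
}
```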
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the timestamp attribute.
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the timestamp attribute.
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- getTimestampCombiner() - Method in interface org.apache.beam.sdk.state.WatermarkHoldState
-
Return the TimestampCombiner which will be used to determine a watermark hold time given an element timestamp, and to combine watermarks from windows which are about to be merged.
- getTimestampCombiner() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getTimestampFn() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy
-
Returns record timestamp (aka event time).
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.TimestampFnPolicy
- getTimestampMillis() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getTimestampMsSinceEpoch() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
-
Timestamp for element (ms since epoch).
- getTimestampPolicyFactory() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- getTimestampPolicyFactory() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getTimestampTransforms() - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
The transforms applied to the arrival time of an element to determine when this trigger allows output.
- getTimestampType() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getTimeToLive() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
The number of milliseconds before the message is discarded or moved to Dead Message Queue.
- getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
- getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- getTiming() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
Return the timing of this pane.
- getTo() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
-
Returns the range end timestamp (exclusive).
- getTo() - Method in class org.apache.beam.sdk.io.range.OffsetRange
- getToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
Unique partition identifier, which can be used to perform a change stream query.
- getTokenNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
-
Deprecated.
- getTokenNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
Deprecated.
- getTokenWithCorrectPartition(Range.ByteStringRange, ChangeStreamContinuationToken) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamContinuationTokenHelper
-
Return the continuation token with correct partition.
- getToKV() - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the topic being written to.
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the topic being read from.
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
-
Gets the topic from which to read.
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getTopic() - Method in class org.apache.beam.sdk.io.mqtt.MqttRecord
- getTopicName() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getTopicPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaSourceDescriptor
- getTopicPartitions() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getTopicPattern() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the ValueProvider for the topic being written to.
- getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the ValueProvider for the topic being read from.
- getTopics() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- getTopics() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getToRowFunction() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
-
Deprecated.
- getToRowFunction() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
-
Returns the toRow conversion function.
- getToRowFunction() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the attached schema's toRowFunction.
- getToRowFunction(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve the function that converts an object of the specified type to a Row object.
- getToRowFunction(Class&lt;T&gt;, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
- getToRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve the function that converts an object of the specified type to a Row object.
- getToSnapshot() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getToSnapshotRef() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getTotalBacklogBytes() - Method in interface org.apache.beam.sdk.io.jms.AutoScaler
-
Returns the size of the backlog of unread data in the underlying data source represented by all splits of this source.
- getTotalBacklogBytes() - Method in class org.apache.beam.sdk.io.jms.DefaultAutoscaler
- getTotalBacklogBytes() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns the size of the backlog of unread data in the underlying data source represented by all splits of this source.
- getTotalFields() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- getTotalFields() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- getTotalStreamDuration() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the total stream duration of change stream records so far.
- getTotalStreamTimeMillis() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The total streaming time (in millis) for this record.
- getToTimestamp() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getTraitDef() - Method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
- getTransactionIsolation() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getTransactionTag() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The transaction tag associated with the given transaction.
- getTransform(RunnerApi.FunctionSpec, PipelineOptions) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceSchemaTransformProvider
- getTransform(RunnerApi.FunctionSpec, PipelineOptions) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
- getTransformId() - Method in class org.apache.beam.sdk.fn.data.DataEndpoint
- getTransformId() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
- getTransformId() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
- getTransformingMap(Map<K1, V1>, Function<K1, K2>, Function<V1, V2>) - Static method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils
- getTransformNameMapping() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Mapping of old PTransform names to new ones, specified as JSON {"oldName":"newName",...}.
- getTransformPayloadTranslators() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.DataflowTransformTranslator
- getTransformPayloadTranslators() - Method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey.Registrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.Registrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.runners.spark.translation.streaming.StreamingTransformTranslator.SparkTransformsRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation.WriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.ReadWriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.WriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.iceberg.IcebergSchemaTransformTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.iceberg.IcebergSchemaTransformTranslation.WriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.jdbc.providers.MySqlSchemaTransformTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.jdbc.providers.MySqlSchemaTransformTranslation.WriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.jdbc.providers.PostgresSchemaTransformTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.jdbc.providers.PostgresSchemaTransformTranslation.WriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.jdbc.providers.SqlServerSchemaTransformTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.jdbc.providers.SqlServerSchemaTransformTranslation.WriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation.WriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation.WriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.TFRecordSchemaTransformTranslation.ReadWriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.managed.ManagedSchemaTransformTranslation.ManagedTransformRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.transforms.Redistribute.Registrar
- getTransformStepNames() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
- getTransformTranslator(Class<TransformT>) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
-
Returns the TransformTranslator to use for instances of the specified PTransform class, or null if none registered.
- getTransformTranslator(TransformT) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.PipelineTranslatorBatch
-
Returns a TransformTranslator for the given PTransform if known.
- getTransformTranslator(TransformT) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
-
Returns a TransformTranslator for the given PTransform if known.
- getTransformUniqueID(RunnerApi.FunctionSpec) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
- getTranslator() - Method in class org.apache.beam.runners.dataflow.DataflowRunner
-
Returns the DataflowPipelineTranslator associated with this object.
- getTransport() - Static method in class org.apache.beam.sdk.extensions.gcp.util.Transport
- getTrigger() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getTriggeringFrequencySeconds() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getTriggeringFrequencySeconds() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getTruncatedRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.TruncateResult
- getTruncateTimestamps() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptions
-
Whether to truncate timestamps in tables described by Data Catalog.
- getTruncateTimestamps() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
- getTSetEnvironment() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getTSetGraph() - Method in class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
- getTupleTag() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
-
Returns the TupleTag of this TaggedKeyedPCollection.
- getTupleTagCoders(Map<TupleTag<?>, PCollection<?>>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Utility to get mapping between TupleTag and a coder.
- getTupleTagDecodeFunction(Map<TupleTag<?>, Coder<WindowedValue<?>>>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Returns a pair function to convert bytes to value via coder.
- getTupleTagEncodeFunction(Map<TupleTag<?>, Coder<WindowedValue<?>>>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Returns a pair function to convert value to bytes via coder.
- getTupleTagId(PValue) - Static method in class org.apache.beam.runners.jet.Utils
- getTupleTagList() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
Returns the TupleTagList tuple associated with this schema.
- getTwister2Home() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getType() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns the type this coder encodes/decodes.
- getType() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
-
Returns the type for the datum factory.
- getType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.TypeWithNullability
- getType() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
- getType() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
-
type of the table.
- getType() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
The type of the column.
- getType() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.Column
- getType() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getType() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Destination
-
Gets the type of the destination (TOPIC, QUEUE or UNKNOWN).
- getType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
Returns the field type.
- getType() - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns the field's Schema.FieldType.
- getType() - Method in interface org.apache.beam.sdk.testing.TestStream.Event
- getType() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
Retrieve the DisplayData.Type of display data.
- getType() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The DisplayData.Type of display data.
- getType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns the Type represented by this TypeDescriptor.
- getType(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
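The TypeDescriptor.getType() entry above returns a captured reflective Type. A stdlib-only sketch of the anonymous-subclass "type token" technique such capture relies on; class names here are illustrative, not Beam's implementation:

```java
// Sketch of the type-token technique: an anonymous subclass captures its
// generic supertype's type argument, which plain reflection can read back.
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.List;

public class TypeTokenDemo {
    abstract static class TypeToken<T> {
        Type capture() {
            // getClass() is the anonymous subclass; its generic superclass
            // is TypeToken<List<String>>, whose argument we extract.
            ParameterizedType superType =
                (ParameterizedType) getClass().getGenericSuperclass();
            return superType.getActualTypeArguments()[0];
        }
    }

    public static void main(String[] args) {
        Type t = new TypeToken<List<String>>() {}.capture();
        System.out.println(t); // prints "java.util.List<java.lang.String>"
    }
}
```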
- getType(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogManagerSchema
- getType(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogSchema
- getType(String) - Method in class org.apache.beam.sdk.schemas.Schema.Options
-
Get the type of an option.
- getTypeClass() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- getTypeClass() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- getTypeDescriptor() - Method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.TypeDescriptorWithSchema
- getTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.ViewFn
-
Return the TypeDescriptor describing the output of this fn.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns a TypeDescriptor<T> with some reflective information about T, if possible.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapFromVoidKeyViewFn
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapViewFn
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapFromVoidKeyViewFn
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapViewFn
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn2
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn2
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn2
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.TupleTag
-
Returns a TypeDescriptor capturing what is known statically about the type of this TupleTag instance's most-derived class.
- getTypeFactory() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getTypeMap() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getTypeName() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getTypeName() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- getTypeNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getTypeNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogManagerSchema
- getTypeNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.CatalogSchema
- getTypeParameter(String) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a TypeVariable for the named type parameter.
- getTypePayload() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
- getTypes() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a set of TypeDescriptors, one for each superclass as well as each interface implemented by this class.
- getTypeUrn() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
- getUdaf(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- getUdafImpl() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- getUdafs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
- getUnalignedCheckpointEnabled() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getUnboundedReaderMaxElements() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The max elements read from an UnboundedReader before checkpointing.
- getUnboundedReaderMaxReadTimeMs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The max amount of time an UnboundedReader is consumed before checkpointing.
- getUnboundedReaderMaxReadTimeSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Deprecated.
- getUnboundedReaderMaxWaitForElementsMs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The max amount of time waiting for elements when reading from UnboundedReader.
- getUnderlyingDoFn() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
- getUnderlyingSchemaProvider(Class<T>) - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
-
Retrieves the underlying SchemaProvider for the given Class.
- getUnderlyingSchemaProvider(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
-
Retrieves the underlying SchemaProvider for the given TypeDescriptor.
- getUnfinishedEndpoints() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Get all unfinished data and timers endpoints represented as [transform_id]:data and [transform_id]:timers:[timer_family_id].
- getUnfinishedMinWatermark() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.cache.AsyncWatermarkCache
- getUnfinishedMinWatermark() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.cache.NoOpWatermarkCache
- getUnfinishedMinWatermark() - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.cache.WatermarkCache
-
Fetches the earliest partition watermark from the partition metadata table that is not in a PartitionMetadata.State.FINISHED state.
- getUnfinishedMinWatermarkFrom(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Fetches the earliest partition watermark from the partition metadata table that is not in a PartitionMetadata.State.FINISHED state.
- getUnionCoder() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
- getUnionTag() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
- getUniqueId() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getUnknownFields() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- getUnknownFieldsPayload() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- getUnsharedKeySize() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefix
- getUntilTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
-
The trigger that signals termination of this trigger.
- getUpdateCompatibilityVersion() - Method in interface org.apache.beam.sdk.options.StreamingOptions
- getUpdatedSchema() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
-
If the table schema has been updated, returns the new schema.
- getUpdatedSchema(TableSchema, TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaUpdateUtils
- getUploadBufferSizeBytes() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
-
If non-null, the upload buffer size to be used.
- getUrl() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getUrl() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getUrn() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.PayloadTranslator
- getUrn() - Method in class org.apache.beam.sdk.schemas.transforms.SchemaTransformTranslation.SchemaTransformPayloadTranslator
- getUrn() - Method in interface org.apache.beam.sdk.transforms.Materialization
-
Gets the URN describing this Materialization.
- getUseActiveSparkSession() - Method in interface org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineOptions
- getUseAltsServer() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- getUseAtLeastOnceSemantics() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getUseCdc() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getUseCdcWrites() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getUseConfigDependenciesForManaged() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- getUseDataStreamForBatch() - Method in interface org.apache.beam.runners.flink.VersionDependentFlinkPipelineOptions
- getUsePublicIps() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Specifies whether worker pools should be started with public IP addresses.
- getUserAgent() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
A user agent string as per RFC2616, describing the pipeline to external services.
- getUserId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getUsername() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- getUsername() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getUsername() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getUsername() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getUsername() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getUsername() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getUsername() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getUseSeparateWindmillHeartbeatStreams() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getUsesProvidedSparkContext() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- getUseStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
-
Enables BigQuery's Standard SQL dialect when reading from a query.
- getUseStorageApiConnectionPool() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getUseStorageWriteApi() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getUseStorageWriteApiAtLeastOnce() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getUseTransformService() - Method in interface org.apache.beam.sdk.extensions.python.PythonExternalTransformOptions
- getUseWindmillIsolatedChannels() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getUsingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- getUuid() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- getUUID() - Method in class org.apache.beam.sdk.schemas.Schema
-
Get this schema's UUID.
- getUuidFromMessage(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- getValidate() - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration
- getValidationFailures() - Method in class org.apache.beam.sdk.io.cdap.context.FailureCollectorWrapper
- getValue() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecord
-
Returns the value.
- getValue() - Method in class org.apache.beam.runners.spark.util.ByteArray
- getValue() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
- getValue() - Method in class org.apache.beam.sdk.coders.StructuralByteArray
- getValue() - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
- getValue() - Method in class org.apache.beam.sdk.io.range.ByteKey
-
Returns a read-only ByteBuffer representing this ByteKey.
- getValue() - Method in class org.apache.beam.sdk.metrics.GaugeResult.EmptyGaugeResult
- getValue() - Method in class org.apache.beam.sdk.metrics.GaugeResult
- getValue() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType.Value
-
Return the integer enum value.
- getValue() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
-
Returns the current value of the OneOf.
- getValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
Retrieve the value of the display item.
- getValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The value of the display item.
- getValue() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
- getValue() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the value of this FailsafeValueInSingleWindow.
- getValue() - Method in class org.apache.beam.sdk.values.KV
-
Returns the value of this KV.
- getValue() - Method in class org.apache.beam.sdk.values.TaggedPValue
-
Returns the PCollection.
- getValue() - Method in class org.apache.beam.sdk.values.TimestampedValue
- getValue() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Returns the value of this ValueInSingleWindow.
- getValue() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
- getValue() - Method in interface org.apache.beam.sdk.values.WindowedValue
-
The primary data for this value.
- getValue() - Method in class org.apache.beam.sdk.values.WindowedValues.Builder
- getValue(int) - Method in class org.apache.beam.sdk.values.Row
-
Get value by field index; a ClassCastException is thrown if the schema doesn't match.
- getValue(int) - Method in class org.apache.beam.sdk.values.RowWithGetters
- getValue(int) - Method in class org.apache.beam.sdk.values.RowWithStorage
- getValue(Class<T>) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
-
Returns the current value of the OneOf as the destination type.
- getValue(String) - Method in class org.apache.beam.sdk.schemas.Schema.Options
-
Get the value of an option.
- getValue(String) - Method in class org.apache.beam.sdk.values.Row
-
Get value by field name; a ClassCastException is thrown if the type doesn't match.
- getValue(String, Class<T>) - Method in class org.apache.beam.sdk.schemas.Schema.Options
-
Get the value of an option.
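The Row.getValue entries above note that a type mismatch surfaces as a ClassCastException at the call site. A minimal sketch of why, using a hypothetical SimpleRow class (not Beam's Row): values are stored as Object and the generic cast is erased, so the failure is deferred to the caller's assignment.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SimpleRowDemo {
  // Hypothetical row: field name -> value, with typed access via an unchecked cast.
  static class SimpleRow {
    private final Map<String, Object> values = new LinkedHashMap<>();

    SimpleRow set(String field, Object value) {
      values.put(field, value);
      return this;
    }

    @SuppressWarnings("unchecked")
    <T> T getValue(String field) {
      // The cast to T is erased here; a wrong T only fails when the
      // caller's implicit checkcast runs on the returned value.
      return (T) values.get(field);
    }
  }

  public static void main(String[] args) {
    SimpleRow row = new SimpleRow().set("age", 42);
    Integer age = row.getValue("age"); // fine: the stored value is an Integer
    System.out.println(age);
    try {
      String bad = row.getValue("age"); // compiles; the assignment's cast fails
      System.out.println(bad);
    } catch (ClassCastException e) {
      System.out.println("ClassCastException: type doesn't match");
    }
  }
}
```

This is why the schema-aware accessors fail lazily: the exception is raised where the result is used, not where the field is looked up.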
- getValueCaptureType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The capture type of the change stream that generated this record.
- getValueClass() - Method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
- getValueCoder() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
-
Returns the value coder.
- getValueCoder() - Method in class org.apache.beam.sdk.coders.KvCoder
- getValueCoder() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
Gets the value coder that will be prefixed by the length.
- getValueCoder() - Method in class org.apache.beam.sdk.coders.MapCoder
- getValueCoder() - Method in class org.apache.beam.sdk.coders.NullableCoder
-
Returns the inner Coder wrapped by this NullableCoder instance.
- getValueCoder() - Method in class org.apache.beam.sdk.coders.OptionalCoder
-
Returns the inner Coder wrapped by this OptionalCoder instance.
- getValueCoder() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
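NullableCoder and OptionalCoder wrap an inner Coder, which getValueCoder() exposes. The wrapping pattern can be sketched as a presence marker followed by the inner encoding; the code below is a hypothetical NullableEncoderDemo illustrating that idea on plain JDK streams, not Beam's actual wire format.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class NullableEncoderDemo {
  // Hypothetical nullable wrapper: write a 0/1 presence byte,
  // then delegate the non-null case to the "inner" encoding (UTF here).
  static byte[] encode(String value) throws IOException {
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    DataOutputStream out = new DataOutputStream(bytes);
    if (value == null) {
      out.writeByte(0); // absent: marker only, inner coder never runs
    } else {
      out.writeByte(1); // present: marker, then the inner coder's bytes
      out.writeUTF(value);
    }
    return bytes.toByteArray();
  }

  static String decode(byte[] data) throws IOException {
    DataInputStream in = new DataInputStream(new ByteArrayInputStream(data));
    return in.readByte() == 0 ? null : in.readUTF();
  }

  public static void main(String[] args) throws IOException {
    System.out.println(decode(encode("hello"))); // hello
    System.out.println(decode(encode(null)));    // null
  }
}
```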
- getValueCoder() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getValueCoder() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getValueCoder() - Method in class org.apache.beam.sdk.testing.TestStream
- getValueCoder() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
- getValueCoder() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- getValueCoder() - Method in class org.apache.beam.sdk.values.WindowedValues.WindowedValueCoder
-
Returns the value coder.
- getValueDeserializerProvider() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getValueOnlyCoder(Coder<T>) - Static method in class org.apache.beam.sdk.values.WindowedValues
-
Returns the ValueOnlyCoder from the given value Coder.
- getValueOrDefault(String, T) - Method in class org.apache.beam.sdk.schemas.Schema.Options
-
Get the value of an option.
- getValues() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- getValues() - Method in class org.apache.beam.sdk.values.Row
-
Return the list of raw unmodified data values to enable 0-copy code.
- getValues() - Method in class org.apache.beam.sdk.values.RowWithGetters
-
Return the list of raw unmodified data values to enable 0-copy code.
- getValues() - Method in class org.apache.beam.sdk.values.RowWithStorage
- getValueSerializer() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getValuesMap() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- getValueTranslationFunction() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getValueTypeDescriptor() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getVerifyRowValues() - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
- getVersion() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- getView() - Method in class org.apache.beam.runners.dataflow.CreateDataflowView
- getView() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
- getView() - Method in class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView.CreateSparkPCollectionView
- getView() - Method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
-
Deprecated. This should not be used to obtain the output of any given application of this PTransform. That should be obtained by inspecting the TransformHierarchy.Node that contains this View.CreatePCollectionView, as this view may have been replaced within pipeline surgery.
- getViewFn() - Method in interface org.apache.beam.sdk.values.PCollectionView
- getViewFn() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
- getVocabulary() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- getVocabulary() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- getWarehouse() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getWarehouse() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getWarehouse() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getWarnings() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getWatchInterval() - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
- getWatchTopicPartitionDuration() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getWatermark() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
- getWatermark() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicy
- getWatermark() - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.ArrivalTimeWatermarkPolicy
- getWatermark() - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.CustomWatermarkPolicy
- getWatermark() - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy
- getWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
- getWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
- getWatermark() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getWatermark() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time for which all records with a timestamp less than it have been processed.
- getWatermark() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns a timestamp before or at the timestamps of all future elements read by this reader.
- getWatermark() - Method in class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
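UnboundedReader.getWatermark's contract above, returning a timestamp at or before every future element, can be maintained as a low-water mark over in-flight timestamps. A sketch with a hypothetical WatermarkTracker class (not a Beam class), assuming elements are tracked from begin to finish:

```java
import java.util.TreeMap;

public class WatermarkTrackerDemo {
  // Hypothetical tracker: the watermark is the oldest unfinished timestamp,
  // so it never claims more progress than the slowest in-flight element.
  static class WatermarkTracker {
    private final TreeMap<Long, Integer> pending = new TreeMap<>();
    private long completedUpTo = Long.MIN_VALUE;

    void begin(long timestampMillis) {
      pending.merge(timestampMillis, 1, Integer::sum);
    }

    void finish(long timestampMillis) {
      // Drop one in-flight count; remove the key when it reaches zero.
      pending.computeIfPresent(timestampMillis, (t, n) -> n == 1 ? null : n - 1);
      completedUpTo = Math.max(completedUpTo, timestampMillis);
    }

    long getWatermark() {
      // Everything earlier than the oldest pending timestamp is done.
      return pending.isEmpty() ? completedUpTo : pending.firstKey();
    }
  }

  public static void main(String[] args) {
    WatermarkTracker tracker = new WatermarkTracker();
    tracker.begin(100);
    tracker.begin(250);
    System.out.println(tracker.getWatermark()); // 100: element at 100 still in flight
    tracker.finish(100);
    System.out.println(tracker.getWatermark()); // 250
  }
}
```

Real readers additionally advance the watermark on idle sources and clamp it to be monotonic; this sketch only shows the min-over-pending invariant.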
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy
-
Returns watermark for the partition.
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.TimestampFnPolicy
- getWatermarkAndState() - Method in interface org.apache.beam.sdk.fn.splittabledofn.WatermarkEstimators.WatermarkAndStateObserver
- getWatermarkCache() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.cache.CacheFactory
- getWatermarkFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getWatermarkIdleDurationThreshold() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getWatermarkIndexName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
- getWatermarkLastUpdated() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
- getWatermarkMillis() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Never.NeverTrigger
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
For internal use only; no backwards-compatibility guarantees.
- getWeigher(Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- getWeight() - Method in class org.apache.beam.sdk.fn.data.WeightedList
- getWindmillGetDataStreamCount() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillHarnessUpdateReportingPeriod() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillMessagesBetweenIsReadyChecks() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillRequestBatchedGetWorkResponse() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillServiceCommitThreads() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillServiceEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
-
Custom windmill service endpoint.
- getWindmillServicePort() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillServiceRpcChannelAliveTimeoutSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillServiceStreamingLogEveryNStreamFailures() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillServiceStreamingRpcBatchLimit() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillServiceStreamingRpcHealthCheckPeriodMs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillServiceStreamMaxBackoffMillis() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindow() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
-
This method returns the number of tuples in each window.
- getWindow() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
- getWindow() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the window of this
FailsafeValueInSingleWindow. - getWindow() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Returns the window of this
ValueInSingleWindow. - getWindow() - Method in interface org.apache.beam.sdk.values.WindowedValues.SingleWindowedValue
- getWindowCoder() - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
- getWindowCoder() - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- getWindowedAggregateDoFnOperator(FlinkStreamingTranslationContext, PTransform<PCollection<KV<K, InputT>>, PCollection<KV<K, OutputT>>>, KvCoder<K, InputAccumT>, Coder<WindowedValue<KV<K, OutputAccumT>>>, SystemReduceFn<K, InputAccumT, ?, OutputAccumT, BoundedWindow>, Map<Integer, PCollectionView<?>>, List<PCollectionView<?>>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
-
Create a DoFnOperator instance that groups elements per window and applies a combine function to them.
- getWindowedAggregateDoFnOperator(FlinkStreamingTranslationContext, PTransform<PCollection<KV<K, InputT>>, PCollection<KV<K, OutputT>>>, KvCoder<K, InputAccumT>, Coder<WindowedValue<KV<K, OutputAccumT>>>, CombineFnBase.GlobalCombineFn<? super InputAccumT, ?, OutputAccumT>, Map<Integer, PCollectionView<?>>, List<PCollectionView<?>>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
- getWindowedValueCoder(String, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
- getWindowedWrites() - Method in class org.apache.beam.sdk.io.WriteFiles
- getWindowFn() - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
- getWindowFn() - Method in class org.apache.beam.sdk.transforms.windowing.Window
- getWindowFn() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getWindowingStrategy() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the WindowingStrategy of this PCollection.
- getWindowingStrategy(String, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
- getWindowingStrategyInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
Deprecated. This method will be removed entirely. The PCollection underlying a side input, including its WindowingStrategy, is part of the side input's specification with a ParDo transform, which will obtain that information via a package-private channel.
- getWindowingStrategyInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
-
Returns the WindowingStrategy of this PCollectionView, which should be that of the underlying PCollection.
- getWindowMappingFn() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
For internal use only.
- getWindowMappingFn() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
- getWindows() - Method in interface org.apache.beam.sdk.values.WindowedValue
-
Returns the windows of this WindowedValue.
- getWindows() - Method in class org.apache.beam.sdk.values.WindowedValues.Builder
- getWindows() - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- getWindowsCoder() - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- getWindowTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- getWindowTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Returns a TypeDescriptor capturing what is known statically about the window type of this WindowFn instance's most-derived class.
- getWithAutoSharding() - Method in class org.apache.beam.sdk.io.WriteFiles
- getWithPartitions() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
- getWorkCompleted() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
-
The known amount of completed work.
- getWorkerCacheMb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The size of the worker's in-memory cache, in megabytes.
- getWorkerCPUs() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getWorkerDiskType() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Specifies what type of persistent disk is used.
- getWorkerHarnessContainerImage() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Deprecated.
- getWorkerId() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
-
The identity of the worker running this pipeline.
- getWorkerId() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
- getWorkerLogLevelOverrides() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated. This option controls the log levels for specifically named loggers.
- getWorkerMachineType() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Machine type to create Dataflow worker VMs as.
- getWorkerPool() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
-
The identity of the worker pool of this worker.
- getWorkerRegion() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g.
- getWorkerSystemErrMessageLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated. Controls the log level given to messages printed to
System.err. - getWorkerSystemOutMessageLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated. Controls the log level given to messages printed to
System.out. - getWorkerZone() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g.
- getWorkRemaining() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
-
The known amount of work remaining.
- getWritableByteChannelFactory() - Method in class org.apache.beam.sdk.io.FileBasedSink
-
Returns the
FileBasedSink.WritableByteChannelFactory used. - getWrite() - Method in class org.apache.beam.io.requestresponse.Cache.Pair
- getWrite() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- getWriteCounterPrefix() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
- getWriteDisposition() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getWriteDisposition() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
- getWriteDisposition() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Returns the disposition specifying how data is written to the table; see:
WriteDisposition. - getWriteFailures() - Method in exception class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.FailedWritesException
-
The list of
FirestoreV1.WriteFailures detailing which writes failed and for what reason. - getWriteOperation() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Return the WriteOperation that this Writer belongs to.
- getWriteRecordsTransform() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
- getWriteResult() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- getWriteStatement() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getWriteStreamSchema(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
- getWriteStreamSchema(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- getWriteStreamSchema(String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getWriteStreamService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
-
Returns a real, mock, or fake
BigQueryServices.WriteStreamService. - getWriteStreamService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
- getWriteStreamService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- getXmlConfiguration() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- getZetaSqlDefaultTimezone() - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
- getZone() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
Deprecated.Use
GcpOptions.getWorkerZone() instead. - global(Map<Integer, GlobalWatermarkHolder.SparkWatermarks>) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Build a global
TimerInternals for all feeding streams. - Global() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.Global
- GLOBAL_SEQUENCE_TRACKER - Static variable in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessor
- GlobalConfigRefreshPeriodFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.GlobalConfigRefreshPeriodFactory
- globalDefault() - Static method in class org.apache.beam.sdk.values.WindowingStrategy
-
Return a fully specified, default windowing strategy.
- GlobalDigest() - Constructor for class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.GlobalDigest
- globally() - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
-
Computes the approximate number of distinct elements in the input
PCollection<InputT> and returns a PCollection<Long>. - globally() - Static method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
-
Create the
PTransform that will build a Count-min sketch for keeping track of the frequency of the elements in the whole stream. - globally() - Static method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
-
Compute the stream in order to build a T-Digest structure (MergingDigest) for keeping track of the stream distribution and returns a
PCollection<MergingDigest>. - globally() - Static method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- globally() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Extract
-
Returns a
PTransform that takes an input PCollection<byte[]> of HLL++ sketches and returns a PCollection<Long> of the estimated count of distinct elements extracted from each sketch. - globally() - Method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init.Builder
-
Returns a
Combine.Globally PTransform that takes an input PCollection<InputT> and returns a PCollection<byte[]> which consists of the HLL++ sketch computed from the elements in the input PCollection. - globally() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.MergePartial
-
Returns a
Combine.Globally PTransform that takes an input PCollection<byte[]> of HLL++ sketches and returns a PCollection<byte[]> of a new sketch merged from the input sketches. - globally() - Static method in class org.apache.beam.sdk.schemas.transforms.Group
-
Returns a transform that groups all elements in the input
PCollection. - globally() - Static method in class org.apache.beam.sdk.transforms.Count
-
Returns a
PTransform that counts the number of elements in its input PCollection. - globally() - Static method in class org.apache.beam.sdk.transforms.Latest
-
Returns a
PTransform that takes as input a PCollection<T> and returns a PCollection<T> whose contents is the latest element according to its event time, or null if there are no elements. - globally() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a
PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the maximum according to the natural ordering of T of the input PCollection's elements, or null if there are no elements. - globally() - Static method in class org.apache.beam.sdk.transforms.Mean
-
Returns a
PTransform that takes an input PCollection<NumT> and returns a PCollection<Double> whose contents is the mean of the input PCollection's elements, or 0 if there are no elements. - globally() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a
PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the minimum according to the natural ordering of T of the input PCollection's elements, or null if there are no elements. - globally(double) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
-
Deprecated. Like
ApproximateUnique.globally(int), but specifies the desired maximum estimation error instead of the sample size. - globally(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
-
Like
ApproximateQuantiles.globally(int, Comparator), but sorts using the elements' natural ordering. - globally(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
-
Deprecated. Returns a
PTransform that takes a PCollection<T> and returns a PCollection<Long> containing a single value that is an estimate of the number of distinct elements in the input PCollection. - globally(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
-
Returns a
PTransform that takes a PCollection<T> and returns a PCollection<List<T>> whose single value is a List of the approximate N-tiles of the elements of the input PCollection. - globally(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a
PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the maximum of the input PCollection's elements, or null if there are no elements. - globally(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a
PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the minimum of the input PCollection's elements, or null if there are no elements. - globally(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.Globally PTransform that uses the given GlobalCombineFn to combine all the elements in each window of the input PCollection into a single value in the output PCollection. - globally(CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>, SerializablePipelineOptions, Map<TupleTag<?>, KV<WindowingStrategy<?, ?>, SideInputBroadcast<?>>>, WindowingStrategy<?, ?>) - Static method in class org.apache.beam.runners.spark.translation.SparkCombineFn
- globally(SerializableBiFunction<V, V, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.Globally PTransform that uses the given SerializableBiFunction to combine all the elements in each window of the input PCollection into a single value in the output PCollection. - globally(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.Globally PTransform that uses the given SerializableFunction to combine all the elements in each window of the input PCollection into a single value in the output PCollection. - Globally() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
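The Combine.globally overloads above all reduce every element in a window to a single output value using an associative, commutative function. As a hedged illustration in plain Java (not the Beam API itself — Combine.globally operates on a PCollection inside a pipeline), the reduction they perform behaves like:

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.BinaryOperator;

public class CombineGloballySketch {
    // Stand-in for the SerializableBiFunction passed to Combine.globally:
    // the function must be associative and commutative, because the runner
    // may combine elements in any order and in any grouping.
    static final BinaryOperator<Integer> SUM = Integer::sum;

    // Models combining all elements of one window into a single value.
    static Integer combineWindow(List<Integer> window, BinaryOperator<Integer> fn) {
        return window.stream().reduce(fn).orElseThrow(IllegalStateException::new);
    }

    public static void main(String[] args) {
        List<Integer> window = Arrays.asList(1, 2, 3, 4);
        System.out.println(combineWindow(window, SUM)); // prints 10
    }
}
```

The same pairwise-fold shape applies whether the argument is a GlobalCombineFn, a SerializableBiFunction, or a SerializableFunction over an Iterable.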
- Globally(double) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.Globally
-
Deprecated.
- Globally(int) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.Globally
-
Deprecated.
- GloballyDistinct() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.GloballyDistinct
- GlobalSketch() - Constructor for class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.GlobalSketch
- GlobalWatermarkHolder - Class in org.apache.beam.runners.spark.util
-
A store to hold the global watermarks for a micro-batch.
- GlobalWatermarkHolder() - Constructor for class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
- GlobalWatermarkHolder.SparkWatermarks - Class in org.apache.beam.runners.spark.util
-
A
GlobalWatermarkHolder.SparkWatermarks holds the watermarks and batch time relevant to a micro-batch input from a specific source. - GlobalWatermarkHolder.WatermarkAdvancingStreamingListener - Class in org.apache.beam.runners.spark.util
-
Advances the watermarks on the onBatchCompleted event.
- GlobalWindow - Class in org.apache.beam.sdk.transforms.windowing
-
The default window into which all data is placed (via
GlobalWindows). - GlobalWindow.Coder - Class in org.apache.beam.sdk.transforms.windowing
-
GlobalWindow.Coder for encoding and decoding GlobalWindows. - GlobalWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A
WindowFn that assigns all data to the same window. - GlobalWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- GoogleADCIdTokenProvider - Class in org.apache.beam.sdk.io.aws2.auth
-
A OIDC web identity token provider implementation that uses the application default credentials set by the runtime (container, GCE instance, local environment, etc.).
- GoogleADCIdTokenProvider() - Constructor for class org.apache.beam.sdk.io.aws2.auth.GoogleADCIdTokenProvider
- GoogleAdsClientFactory - Interface in org.apache.beam.sdk.io.googleads
-
Defines how to construct a
GoogleAdsClient. - GoogleAdsCredentialsFactory() - Constructor for class org.apache.beam.sdk.io.googleads.GoogleAdsOptions.GoogleAdsCredentialsFactory
- GoogleAdsIO<GoogleAdsRowT,
SearchGoogleAdsStreamRequestT> - Class in org.apache.beam.sdk.io.googleads -
GoogleAdsIO provides an API for reading from the Google Ads API over supported versions of the Google Ads client libraries. - GoogleAdsIO() - Constructor for class org.apache.beam.sdk.io.googleads.GoogleAdsIO
- GoogleAdsIO.RateLimitPolicy<GoogleAdsErrorT> - Interface in org.apache.beam.sdk.io.googleads
-
This interface can be used to implement custom client-side rate limiting policies.
- GoogleAdsIO.RateLimitPolicyFactory<GoogleAdsErrorT> - Interface in org.apache.beam.sdk.io.googleads
-
Implement this interface to create a
GoogleAdsIO.RateLimitPolicy. - GoogleAdsOptions - Interface in org.apache.beam.sdk.io.googleads
-
Options used to configure Google Ads API specific options.
- GoogleAdsOptions.GoogleAdsCredentialsFactory - Class in org.apache.beam.sdk.io.googleads
-
Attempts to load the Google Ads credentials.
- GoogleAdsUserCredentialFactory - Class in org.apache.beam.sdk.io.googleads
-
Constructs and returns
Credentials to be used by Google Ads API calls. - GoogleAdsV19 - Class in org.apache.beam.sdk.io.googleads
-
GoogleAdsV19 provides an API to read Google Ads API v19 reports. - GoogleAdsV19.Read - Class in org.apache.beam.sdk.io.googleads
-
A
PTransform that reads the results of a Google Ads query as GoogleAdsRow objects. - GoogleAdsV19.ReadAll - Class in org.apache.beam.sdk.io.googleads
-
A
PTransform that reads the results of many SearchGoogleAdsStreamRequest objects as GoogleAdsRow objects. - GoogleAdsV19.SimpleRateLimitPolicy - Class in org.apache.beam.sdk.io.googleads
-
This rate limit policy wraps a
RateLimiter and can be used in low volume and development use cases as a client-side rate limiting policy. - GoogleApiDebugOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
-
These options configure debug settings for Google API clients created within the Apache Beam SDK.
- GoogleApiDebugOptions.GoogleApiTracer - Class in org.apache.beam.sdk.extensions.gcp.options
-
A
GoogleClientRequestInitializer that adds the trace destination to Google API calls. - GoogleApiTracer() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
- GraphiteSink - Class in org.apache.beam.runners.spark.metrics.sink
-
A
Sink for Spark's metric system reporting metrics (including Beam step metrics) to Graphite. - GraphiteSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
-
Constructor for Spark 3.2.x and later.
- GraphiteSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
-
Constructor for Spark 3.1.x and earlier.
- GREATER_THAN - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- GREATER_THAN_OR_EQUAL - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- greaterThan(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.greaterThan(Comparable). - greaterThan(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.greaterThan(Comparable). - greaterThan(T) - Static method in class org.apache.beam.sdk.transforms.Filter
-
Returns a
PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are greater than a given value, based on the elements' natural ordering. - greaterThanEq(T) - Static method in class org.apache.beam.sdk.transforms.Filter
-
Returns a
PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are greater than or equal to a given value, based on the elements' natural ordering. - greaterThanOrEqualTo(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.greaterThanOrEqualTo(Comparable). - greaterThanOrEqualTo(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher with identical criteria to Matchers.greaterThanOrEqualTo(Comparable). - Group - Class in org.apache.beam.sdk.schemas.transforms
-
A generic grouping transform for schema
PCollections. - Group() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group
- Group.AggregateCombiner<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
a
PTransform that does a combine using an aggregation built up by calls to aggregateField and aggregateFields. - Group.ByFields<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
a
PTransform that groups schema elements based on the given fields. - Group.CombineFieldsByFields<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
a
PTransform that does a per-key combine using an aggregation built up by calls to aggregateField and aggregateFields. - Group.CombineFieldsByFields.Fanout - Class in org.apache.beam.sdk.schemas.transforms
- Group.CombineFieldsByFields.Fanout.Kind - Enum Class in org.apache.beam.sdk.schemas.transforms
- Group.CombineFieldsGlobally<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
a
PTransform that does a global combine using an aggregation built up by calls to aggregateField and aggregateFields. - Group.CombineGlobally<InputT,
OutputT> - Class in org.apache.beam.sdk.schemas.transforms -
a
PTransformthat does a global combine using a providerCombine.CombineFn. - Group.Global<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
A
PTransformfor doing global aggregations on schema PCollections. - GroupAlsoByWindowViaOutputBufferFn<K,
InputT, - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functionsW> -
A FlatMap function that groups by windows in batch mode using
ReduceFnRunner. - GroupAlsoByWindowViaOutputBufferFn(WindowingStrategy<?, W>, StateInternalsFactory<K>, SystemReduceFn<K, InputT, Iterable<InputT>, Iterable<InputT>, W>, Supplier<PipelineOptions>) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.GroupAlsoByWindowViaOutputBufferFn
- GroupByEncryptedKey<K,
V> - Class in org.apache.beam.sdk.transforms -
A
PTransformthat provides a secure alternative toGroupByKey. - GroupByKey<K,
V> - Class in org.apache.beam.sdk.transforms -
GroupByKey<K, V>takes aPCollection<KV<K, V>>, groups the values by key and windows, and returns aPCollection<KV<K, Iterable<V>>>representing a map from each distinct key and window of the inputPCollectionto anIterableover all the values associated with that key in the input per window. - groupByKeyAndWindow(JavaDStream<WindowedValue<KV<K, InputT>>>, Coder<K>, Coder<WindowedValue<InputT>>, WindowingStrategy<?, W>, SerializablePipelineOptions, List<Integer>, String) - Static method in class org.apache.beam.runners.spark.stateful.SparkGroupAlsoByWindowViaWindowSet
- groupByKeyOnly(JavaRDD<WindowedValue<KV<K, V>>>, Coder<K>, WindowedValues.WindowedValueCoder<V>, Partitioner) - Static method in class org.apache.beam.runners.spark.translation.GroupCombineFunctions
-
An implementation of
GroupByKeyViaGroupByKeyOnly.GroupByKeyOnlyfor the Spark runner. - GroupByKeyTranslatorBatch<K,
V> - Class in org.apache.beam.runners.twister2.translators.batch -
GroupByKey translator.
- GroupByKeyTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.GroupByKeyTranslatorBatch
- GroupByKeyVisitor - Class in org.apache.beam.runners.spark.translation
-
Traverses the pipeline to populate the candidates for group by key.
- GroupByKeyVisitor(SparkPipelineTranslator, EvaluationContext) - Constructor for class org.apache.beam.runners.spark.translation.GroupByKeyVisitor
- GroupByWindowFunction<K,
V, - Class in org.apache.beam.runners.twister2.translators.functionsW> -
GroupBy window function.
- GroupByWindowFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
- GroupByWindowFunction(WindowingStrategy<?, W>, SystemReduceFn<K, V, Iterable<V>, Iterable<V>, W>, PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
- GroupCombineFunctions - Class in org.apache.beam.runners.spark.translation
-
A set of group/combine functions to apply to Spark
RDDs. - GroupCombineFunctions() - Constructor for class org.apache.beam.runners.spark.translation.GroupCombineFunctions
- grouped() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Same transform but can be applied to
PCollectionofMutationGroup. - groupedValues(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.GroupedValuesPTransformthat takes aPCollectionofKVs where a key maps to anIterableof values, e.g., the result of aGroupByKey, then uses the givenCombineFnto combine all the values associated with a key, ignoring the key. - groupedValues(SerializableBiFunction<V, V, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.GroupedValuesPTransformthat takes aPCollectionofKVs where a key maps to anIterableof values, e.g., the result of aGroupByKey, then uses the givenSerializableFunctionto combine all the values associated with a key, ignoring the key. - groupedValues(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.GroupedValuesPTransformthat takes aPCollectionofKVs where a key maps to anIterableof values, e.g., the result of aGroupByKey, then uses the givenSerializableFunctionto combine all the values associated with a key, ignoring the key. - GroupingState<InputT,
OutputT> - Interface in org.apache.beam.sdk.state -
A
ReadableStatecell that combines multiple input values and outputs a single value of a different type. - GroupIntoBatches<K,
InputT> - Class in org.apache.beam.sdk.transforms -
A
PTransformthat batches inputs to a desired batch size. - GroupIntoBatches.BatchingParams<InputT> - Class in org.apache.beam.sdk.transforms
-
Wrapper class for batching parameters supplied by users.
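GroupIntoBatches groups a keyed PCollection into batches of at most a configured size (set via GroupIntoBatches.ofSize, with parameters carried by BatchingParams). A hedged plain-Java sketch of the per-key batching behavior (the real transform additionally handles windowing, state, and timers, which this ignores):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class GroupIntoBatchesSketch {
    // Splits one key's values into batches of at most batchSize elements,
    // mirroring what GroupIntoBatches emits per key for a given batch size.
    static <T> List<List<T>> batch(List<T> values, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < values.size(); i += batchSize) {
            batches.add(new ArrayList<>(
                values.subList(i, Math.min(i + batchSize, values.size()))));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<List<Integer>> batches = batch(Arrays.asList(1, 2, 3, 4, 5), 2);
        System.out.println(batches); // prints [[1, 2], [3, 4], [5]]
    }
}
```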
- GroupIntoBatches.WithShardedKey - Class in org.apache.beam.sdk.transforms
- GroupIntoBatchesOverride - Class in org.apache.beam.runners.dataflow
- GroupIntoBatchesOverride() - Constructor for class org.apache.beam.runners.dataflow.GroupIntoBatchesOverride
- GroupNonMergingWindowsFunctions - Class in org.apache.beam.runners.spark.translation
-
Functions for translating GroupByKey with non-merging windows to Spark.
- GroupNonMergingWindowsFunctions() - Constructor for class org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions
- groups() - Element in annotation interface org.apache.beam.sdk.options.Validation.Required
-
The groups that the annotated attribute is a member of.
- GrowableOffsetRangeTracker - Class in org.apache.beam.sdk.transforms.splittabledofn
-
An
OffsetRangeTracker for tracking a growable offset range. - GrowableOffsetRangeTracker(long, GrowableOffsetRangeTracker.RangeEndEstimator) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker
- GrowableOffsetRangeTracker.RangeEndEstimator - Interface in org.apache.beam.sdk.transforms.splittabledofn
-
Provides the estimated end offset of the range.
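A GrowableOffsetRangeTracker tracks an offset range whose end is supplied on demand by a RangeEndEstimator instead of being fixed up front. As a hedged sketch (illustrative arithmetic, not Beam's exact implementation), progress through such a range can be derived from the current position and the estimated end:

```java
public class OffsetProgressSketch {
    // Fraction of the range [start, estimatedEnd) that has been processed
    // once the tracker has advanced to currentOffset.
    static double fractionComplete(long start, long currentOffset, long estimatedEnd) {
        if (estimatedEnd <= start) {
            return 0.0; // nothing known to process yet
        }
        long done = Math.max(0, currentOffset - start);
        return Math.min(1.0, (double) done / (estimatedEnd - start));
    }

    public static void main(String[] args) {
        System.out.println(fractionComplete(0, 75, 100)); // prints 0.75
    }
}
```

Because the end is only an estimate, the fraction is clamped to 1.0; a shrinking estimate never reports more than full completion.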
- Growth() - Constructor for class org.apache.beam.sdk.transforms.Watch.Growth
- growthOf(Contextful<Watch.Growth.PollFn<InputT, OutputT>>, SerializableFunction<OutputT, KeyT>) - Static method in class org.apache.beam.sdk.transforms.Watch
-
Watches the growth of the given poll function, using the given "key function" to deduplicate outputs.
- growthOf(Watch.Growth.PollFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch
-
Watches the growth of the given poll function.
- growthOf(Watch.Growth.PollFn<InputT, OutputT>, Requirements) - Static method in class org.apache.beam.sdk.transforms.Watch
-
Watches the growth of the given poll function.
- GrpcContextHeaderAccessorProvider - Class in org.apache.beam.sdk.fn.server
-
A HeaderAccessorProvider which intercepts the headers in a gRPC request and exposes the relevant fields.
- GrpcContextHeaderAccessorProvider() - Constructor for class org.apache.beam.sdk.fn.server.GrpcContextHeaderAccessorProvider
- GrpcDataService - Class in org.apache.beam.runners.fnexecution.data
-
A
FnDataService implemented via gRPC. - GrpcDataService() - Constructor for class org.apache.beam.runners.fnexecution.data.GrpcDataService
-
Deprecated. This constructor is for migrating Dataflow purposes only.
- GrpcFnServer<ServiceT> - Class in org.apache.beam.sdk.fn.server
-
A
gRPC Server which manages a single FnService. - GrpcLoggingService - Class in org.apache.beam.runners.fnexecution.logging
-
An implementation of the Beam Fn Logging Service over gRPC.
- GrpcStateService - Class in org.apache.beam.runners.fnexecution.state
-
An implementation of the Beam Fn State service.
- guessExpressionType(String, Map<String, Type>) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler
- GZIP - Enum constant in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- GZIP - Enum constant in enum class org.apache.beam.sdk.io.Compression
-
GZip compression.
- GZIP - Enum constant in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- GZIP - Enum constant in enum class org.apache.beam.sdk.io.TextIO.CompressionType
-
Deprecated.
- GZIP - Enum constant in enum class org.apache.beam.sdk.io.TFRecordIO.CompressionType
-
Deprecated.
- GZIP - Enum constant in enum class org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
-
Deprecated.
H
- HADOOP - Enum constant in enum class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options.SorterType
- hadoopConfiguration - Variable in class org.apache.beam.sdk.io.cdap.Plugin
- HadoopFileSystemModule - Class in org.apache.beam.sdk.io.hdfs
- HadoopFileSystemModule() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemModule
- HadoopFileSystemOptions - Interface in org.apache.beam.sdk.io.hdfs
- HadoopFileSystemOptions.ConfigurationLocator - Class in org.apache.beam.sdk.io.hdfs
-
A
DefaultValueFactory which locates a Hadoop Configuration. - HadoopFileSystemOptionsRegistrar - Class in org.apache.beam.sdk.io.hdfs
-
AutoService registrar for HadoopFileSystemOptions. - HadoopFileSystemOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptionsRegistrar
- HadoopFileSystemRegistrar - Class in org.apache.beam.sdk.io.hdfs
-
AutoService registrar for the HadoopFileSystem. - HadoopFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemRegistrar
- HadoopFormatIO - Class in org.apache.beam.sdk.io.hadoop.format
-
A
HadoopFormatIO is a Transform for reading data from any source or writing data to any sink which implements Hadoop InputFormat or OutputFormat. - HadoopFormatIO() - Constructor for class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
- HadoopFormatIO.HadoopInputFormatBoundedSource<K,
V> - Class in org.apache.beam.sdk.io.hadoop.format -
Bounded source implementation for
HadoopFormatIO. - HadoopFormatIO.Read<K,
V> - Class in org.apache.beam.sdk.io.hadoop.format -
A
PTransform that reads from any data source which implements Hadoop InputFormat. - HadoopFormatIO.SerializableSplit - Class in org.apache.beam.sdk.io.hadoop.format
-
A wrapper to allow Hadoop
InputSplit to be serialized using Java's standard serialization mechanisms. - HadoopFormatIO.Write<KeyT,
ValueT> - Class in org.apache.beam.sdk.io.hadoop.format -
A
PTransform that writes to any data sink which implements Hadoop OutputFormat. - HadoopFormatIO.Write.ExternalSynchronizationBuilder<KeyT,
ValueT> - Interface in org.apache.beam.sdk.io.hadoop.format -
Builder for defining the external synchronization.
- HadoopFormatIO.Write.PartitionedWriterBuilder<KeyT,
ValueT> - Interface in org.apache.beam.sdk.io.hadoop.format -
Builder for determining the partitioning.
- HadoopFormatIO.Write.WriteBuilder<KeyT,
ValueT> - Interface in org.apache.beam.sdk.io.hadoop.format -
Main builder of Write transformation.
- HadoopInputFormatBoundedSource(SerializableConfiguration, Coder<K>, Coder<V>, SimpleFunction<?, K>, SimpleFunction<?, V>, HadoopFormatIO.SerializableSplit, boolean, boolean) - Constructor for class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
- handle(BeamFnApi.InstructionRequest) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
- handle(BeamFnApi.InstructionRequest) - Method in interface org.apache.beam.runners.fnexecution.control.InstructionRequestHandler
- handle(BeamFnApi.StateRequest) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandler
-
Handle a
BeamFnApi.StateRequest asynchronously. - handle(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
-
Opportunity to further refine the relational expression created for a given level.
- handleErrorEx(Object, JCSMPException, long) - Method in class org.apache.beam.sdk.io.solace.broker.PublishResultHandler
- handleSplitRequest(int, String) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplitEnumerator
- handleSplitRequest(int, String) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.LazyFlinkSourceSplitEnumerator
- Handling Errors - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- HarnessUpdateReportingPeriodFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.HarnessUpdateReportingPeriodFactory
- has(String) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Returns whether this
PCollectionRowTuplecontains aPCollectionwith the given tag. - has(String) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
Returns whether this
PCollectionTuplecontains aPCollectionwith the given tag. - has(ObjectT) - Method in interface org.apache.beam.sdk.schemas.FieldValueHaver
- has(TupleTag<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
Returns whether this
PCollectionTuplecontains aPCollectionwith the given tag. - hasAnyPrefix() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
- hasCommitted() - Method in class org.apache.beam.sdk.metrics.MetricResult
- hasDatabase(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogBeamSchema
-
Checks if metastore client has the specified database.
- hasDefault() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated.Returns if a default value was specified.
- hasDefault() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
-
Returns if a default value was specified.
- HasDefaultTracker<RestrictionT, TrackerT> - Interface in org.apache.beam.sdk.transforms.splittabledofn
-
Interface for restrictions for which a default implementation of DoFn.NewTracker is available, depending only on the restriction itself.
- hasDefaultValue() - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
-
Returns whether this transform has a default value.
- HasDefaultWatermarkEstimator<WatermarkEstimatorStateT, WatermarkEstimatorT> - Interface in org.apache.beam.sdk.transforms.splittabledofn
-
Interface for watermark estimator state for which a default implementation of DoFn.NewWatermarkEstimator is available, depending only on the watermark estimator state itself.
- HasDisplayData - Interface in org.apache.beam.sdk.transforms.display
-
Marker interface for PTransforms and components to specify display data used within UIs and diagnostic tools.
- hasErrored() - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
Returns whether this handler has errored since it was last reset.
- hasEventTimers(DoFn<?, ?>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Checks if the given DoFn uses event time timers.
- hasException() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- hasExperiment(DataflowPipelineDebugOptions, String) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner
-
Returns true if the specified experiment is enabled, handling null experiments.
- hasExperiment(PipelineOptions, String) - Static method in interface org.apache.beam.sdk.options.ExperimentalOptions
-
Returns true iff the provided pipeline options has the specified experiment enabled.
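A hedged sketch of the ExperimentalOptions.hasExperiment entry above; the experiment name "my_experiment" is made up for illustration.

```java
import org.apache.beam.sdk.options.ExperimentalOptions;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

// Fresh options: the experiments list is null, which hasExperiment tolerates.
PipelineOptions plain = PipelineOptionsFactory.create();
boolean off = ExperimentalOptions.hasExperiment(plain, "my_experiment");

// Options parsed with the experiment flag set.
PipelineOptions withExp =
    PipelineOptionsFactory.fromArgs("--experiments=my_experiment").create();
boolean on = ExperimentalOptions.hasExperiment(withExp, "my_experiment");
```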
- hasFailedRecords(List<ResT>) - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
- hasField(String) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns true if fieldName exists in the schema, false otherwise.
- hasGlobWildcard(String) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Checks whether the given spec contains a glob wildcard character.
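A small sketch of the FileSystems.hasGlobWildcard entry above; the paths are illustrative.

```java
import org.apache.beam.sdk.io.FileSystems;

// '*' is a glob wildcard, so this spec can match multiple files.
boolean globbed = FileSystems.hasGlobWildcard("gs://bucket/logs/*.txt");
// A literal object path contains no glob characters.
boolean literal = FileSystems.hasGlobWildcard("gs://bucket/logs/part-00000.txt");
```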
- hash(byte[]) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- hash(List<?>) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
-
Computes the shard id for the given key component(s).
- hashCode() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- hashCode() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
- hashCode() - Method in class org.apache.beam.runners.dataflow.util.OutputReference
- hashCode() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
- hashCode() - Method in class org.apache.beam.runners.flink.adapter.FlinkKey
- hashCode() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- hashCode() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- hashCode() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- hashCode() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- hashCode() - Method in class org.apache.beam.runners.jet.Utils.ByteArrayKey
- hashCode() - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
- hashCode() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
- hashCode() - Method in class org.apache.beam.runners.spark.util.ByteArray
- hashCode() - Method in class org.apache.beam.runners.spark.util.TimerUtils.TimerMarker
- hashCode() - Method in class org.apache.beam.sdk.coders.AtomicCoder
- hashCode() - Method in class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated.
- hashCode() - Method in class org.apache.beam.sdk.coders.DelegateCoder
- hashCode() - Method in class org.apache.beam.sdk.coders.RowCoder
- hashCode() - Method in class org.apache.beam.sdk.coders.SerializableCoder
- hashCode() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- hashCode() - Method in class org.apache.beam.sdk.coders.StructuralByteArray
- hashCode() - Method in class org.apache.beam.sdk.coders.StructuredCoder
- hashCode() - Method in class org.apache.beam.sdk.coders.ZstdCoder
- hashCode() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- hashCode() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
- hashCode() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
- hashCode() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- hashCode() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
- hashCode() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- hashCode() - Method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
- hashCode() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- hashCode() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- hashCode() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
- hashCode() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
- hashCode() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- hashCode() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- hashCode() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- hashCode() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
- hashCode() - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
- hashCode() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteSuccessSummary
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.