Index
A
- abort() - Method in interface org.apache.beam.runners.fnexecution.state.StateDelegator.Registration
-
De-registers the handler for all future requests for state for the registered process bundle instruction id.
- abort(Executor) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- absolute(String, String...) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
-
Construct a path from an absolute component path hierarchy.
- AbstractBeamCalcRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace Project and Filter node.
- AbstractBeamCalcRel(RelOptCluster, RelTraitSet, RelNode, RexProgram) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
- AbstractFlinkCombineRunner<K, InputT, AccumT, OutputT, W> - Class in org.apache.beam.runners.flink.translation.functions
-
Abstract base for runners that execute a Combine.PerKey.
- AbstractFlinkCombineRunner() - Constructor for class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner
- AbstractFlinkCombineRunner.CompleteFlinkCombiner<K, InputT, AccumT, OutputT> - Class in org.apache.beam.runners.flink.translation.functions
- AbstractFlinkCombineRunner.FinalFlinkCombiner<K, AccumT, OutputT> - Class in org.apache.beam.runners.flink.translation.functions
-
A final combiner that takes in AccumT and produces OutputT.
- AbstractFlinkCombineRunner.FlinkCombiner<K, InputT, AccumT, OutputT> - Interface in org.apache.beam.runners.flink.translation.functions
-
Adapter interface that allows using a CombineFnBase.GlobalCombineFn to either produce the AccumT as output or to combine several accumulators into an OutputT.
- AbstractFlinkCombineRunner.PartialFlinkCombiner<K, InputT, AccumT> - Class in org.apache.beam.runners.flink.translation.functions
-
A partial combiner that takes in InputT and produces AccumT.
- AbstractGetterTypeSupplier() - Constructor for class org.apache.beam.sdk.schemas.AutoValueSchema.AbstractGetterTypeSupplier
- AbstractInOutIterator<K, InputT, OutputT> - Class in org.apache.beam.runners.spark.translation
-
Abstract base class for iterators that process Spark input data and produce corresponding output values.
- AbstractInOutIterator(SparkProcessContext<K, InputT, OutputT>) - Constructor for class org.apache.beam.runners.spark.translation.AbstractInOutIterator
- AbstractReadFileRangesFn(SerializableFunction<String, ? extends FileBasedSource<InT>>, ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform.AbstractReadFileRangesFn
- AbstractResult() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifyComponentContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.WildcardContext
- accept(BeamFnApi.Elements) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
- accept(SchemaZipFold.Context, Optional<Schema.Field>, Optional<Schema.Field>) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
-
Accepts two fields, context.parent() is always ROW.
- accept(SchemaZipFold.Context, Schema.FieldType, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
-
Accepts two components, context.parent() is always ROW, MAP, ARRAY or absent.
- accept(ByteString) - Method in class org.apache.beam.sdk.fn.data.DecodingFnDataReceiver
- accept(T) - Method in interface org.apache.beam.sdk.fn.data.FnDataReceiver
- accept(T) - Method in interface org.apache.beam.sdk.function.ThrowingConsumer
- accept(T1, T2) - Method in interface org.apache.beam.sdk.function.ThrowingBiConsumer
- accessPattern() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
- accessType() - Method in class org.apache.beam.sdk.io.solace.data.Semp.QueueData
- accumulate(T, T) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
-
Accumulate two results together.
- accumulateWeight(long) - Method in class org.apache.beam.sdk.fn.data.WeightedList
- ACCUMULATING_FIRED_PANES - Enum constant in enum class org.apache.beam.sdk.values.WindowingStrategy.AccumulationMode
- AccumulatingCombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
- accumulatingFiredPanes() - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Returns a new Window PTransform that uses the registered WindowFn and Triggering behavior, and that accumulates elements in a pane after they are triggered.
- ACCUMULATOR_NAME - Static variable in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
- AccumulatorCheckpointingSparkListener() - Constructor for class org.apache.beam.runners.spark.metrics.MetricsAccumulator.AccumulatorCheckpointingSparkListener
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return the ack deadline, in seconds, for subscription.
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- ackId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
-
Id to pass back to Pubsub to acknowledge receipt of this message.
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Acknowledge messages from subscription with ackIds.
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- acquireTaskAttemptIdLock(Configuration, int) - Method in interface org.apache.beam.sdk.io.hadoop.format.ExternalSynchronization
-
Creates a unique TaskAttemptID for the given taskId.
- acquireTaskAttemptIdLock(Configuration, int) - Method in class org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization
- acquireTaskIdLock(Configuration) - Method in interface org.apache.beam.sdk.io.hadoop.format.ExternalSynchronization
-
Creates a TaskID with a unique id within the given job.
- acquireTaskIdLock(Configuration) - Method in class org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization
- action() - Method in class org.apache.beam.runners.spark.translation.BoundedDataset
- action() - Method in interface org.apache.beam.runners.spark.translation.Dataset
- action() - Method in class org.apache.beam.runners.spark.translation.streaming.UnboundedDataset
- ActionFactory - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
-
Factory class for creating instances that will handle different functions of DoFns.
- ActionFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
Factory class for creating instances that will handle each type of record within a change stream query.
- ActionFactory() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
- ActionFactory() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
- activate(MetricsContainer) - Method in class org.apache.beam.sdk.metrics.MetricsEnvironment.MetricsContainerHolder
- activate(MetricsContainer) - Method in interface org.apache.beam.sdk.metrics.MetricsEnvironment.MetricsEnvironmentState
- ACTIVE_PARTITION_READ_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the active partition reads during the execution of the Connector.
- actuateProjectionPushdown(Map<TupleTag<?>, FieldAccessDescriptor>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- actuateProjectionPushdown(Map<TupleTag<?>, FieldAccessDescriptor>) - Method in interface org.apache.beam.sdk.schemas.ProjectionProducer
-
Actuate a projection pushdown.
- add(int, GlobalWatermarkHolder.SparkWatermarks) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
- add(long) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
-
Deprecated. Adds a value to the heap, returning whether the value is (large enough to be) in the heap.
- add(long, Instant, boolean) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
- add(InputT) - Method in interface org.apache.beam.sdk.state.GroupingState
-
Add a value to the buffer.
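As a usage sketch for GroupingState.add: a stateful DoFn declares a state cell (here a BagState, which extends GroupingState) and buffers values into it. The state id "buffer" and the element types are illustrative:

    import org.apache.beam.sdk.state.BagState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    class BufferFn extends DoFn<KV<String, String>, String> {
      // Declares the state cell; BagState is one of the GroupingState implementations.
      @StateId("buffer")
      private final StateSpec<BagState<String>> bufferSpec = StateSpecs.bag();

      @ProcessElement
      public void process(ProcessContext c, @StateId("buffer") BagState<String> buffer) {
        buffer.add(c.element().getValue()); // GroupingState.add: add a value to the buffer.
      }
    }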
- add(Class<?>, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
- add(Iterable<String>) - Method in class org.apache.beam.runners.jet.metrics.BoundedTrieImpl
- add(Iterable<String>) - Method in interface org.apache.beam.sdk.metrics.BoundedTrie
-
Adds a path to the trie.
- add(Iterable<String>) - Method in class org.apache.beam.sdk.metrics.Lineage
-
Adds the given fqn as lineage.
- add(Iterable<TimestampedValue<T>>) - Static method in class org.apache.beam.sdk.testing.TestStream.ElementEvent
-
For internal use only: no backwards compatibility guarantees.
- add(Type, String, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
- add(String) - Method in class org.apache.beam.runners.jet.metrics.StringSetImpl
- add(String) - Method in interface org.apache.beam.sdk.metrics.StringSet
-
Add a value to this set.
- add(String...) - Method in class org.apache.beam.runners.jet.metrics.BoundedTrieImpl
- add(String...) - Method in class org.apache.beam.runners.jet.metrics.StringSetImpl
- add(String...) - Method in interface org.apache.beam.sdk.metrics.BoundedTrie
-
Adds a path to the trie.
- add(String...) - Method in interface org.apache.beam.sdk.metrics.StringSet
-
Add values to this set.
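For illustration, a StringSet metric is obtained from a metrics factory and populated with add. A minimal sketch, assuming the Metrics.stringSet factory is available in the Beam version at hand; the namespace, name, and values are made up:

    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.metrics.StringSet;

    // Typically called from inside a DoFn; the set deduplicates values.
    StringSet sourceSystems = Metrics.stringSet("my.namespace", "sourceSystems");
    sourceSystems.add("kafka");
    sourceSystems.add("pubsub", "bigquery"); // varargs overload adds several values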
- add(String, String, Iterable<String>, String) - Method in class org.apache.beam.sdk.metrics.Lineage
-
Add a FQN (fully-qualified name) to Lineage.
- add(String, Iterable<String>) - Method in class org.apache.beam.sdk.metrics.Lineage
-
Add a FQN (fully-qualified name) to Lineage.
- add(String, Iterable<String>, String) - Method in class org.apache.beam.sdk.metrics.Lineage
-
Add a FQN (fully-qualified name) to Lineage.
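A hedged sketch of the Lineage add overloads above, assuming the static Lineage.getSources() accessor; the system and FQN segments are made up:

    import java.util.Arrays;
    import org.apache.beam.sdk.metrics.Lineage;

    // Report a source FQN as a system plus path segments (all names illustrative).
    Lineage.getSources().add("bigquery", Arrays.asList("myproject", "mydataset", "mytable"));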
- add(List<ValueInSingleWindow<T>>, TableDataInsertAllResponse.InsertErrors, TableReference, FailsafeValueInSingleWindow<TableRow, TableRow>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
- add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
- add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
- add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
- add(DisplayData.ItemSpec<?>) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register the given display item.
- add(KV<byte[], byte[]>) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
- add(KV<byte[], byte[]>) - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter
-
Adds a given record to the sorter.
- add(WindowedValue<InputT>, SparkCombineFn<InputT, ValueT, AccumT, ?>) - Method in interface org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator
-
Add value with unexploded windows into the accumulator.
- add(T) - Method in class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
- add(T, long) - Method in class org.apache.beam.sdk.fn.data.WeightedList
- add(T, long, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
- add(T, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
- addAccum(AccumT) - Method in interface org.apache.beam.sdk.state.CombiningState
-
Add an accumulator to this state cell.
- addAll(List<T>, long) - Method in class org.apache.beam.sdk.fn.data.WeightedList
- addAll(Map<Integer, Queue<GlobalWatermarkHolder.SparkWatermarks>>) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
- addAll(WeightedList<T>) - Method in class org.apache.beam.sdk.fn.data.WeightedList
- addAnnotation(String, byte[]) - Method in class org.apache.beam.sdk.transforms.PTransform
- addArray(Object...) - Method in class org.apache.beam.sdk.values.Row.Builder
- addArray(Collection<T>) - Method in class org.apache.beam.sdk.values.Row.Builder
- addArrayField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addAttempted(T, BiFunction<T, T, T>) - Method in class org.apache.beam.sdk.metrics.MetricResult
- addBatchWriteRequest(long, boolean) - Method in interface org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler.Stats
- addBoolean(Map<String, Object>, String, boolean) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addBooleanField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addByteArrayField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addByteField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addCoderAndEncodedRecord(Coder<T>, T) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
- addCollectionToSingletonOutput(PCollection<?>, String, PCollectionView<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an output to this CollectionToSingleton Dataflow step, consuming the specified input PValue and producing the specified output PValue.
- addCommitted(T, BiFunction<T, T, T>) - Method in class org.apache.beam.sdk.metrics.MetricResult
- addDataSet(String, DataSet<T>) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
- addDataStream(String, DataStream<T>) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
- addDateTimeField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addDecimalField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addDictionary(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addDouble(Map<String, Object>, String, Double) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addDoubleField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
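The add*Field methods on Schema.Builder indexed here (and, further down, their addNullable* variants) compose a schema fluently. A minimal sketch with illustrative field names:

    import org.apache.beam.sdk.schemas.Schema;

    Schema schema =
        Schema.builder()
            .addStringField("name")
            .addInt64Field("count")
            .addNullableDoubleField("score")
            .build();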
- addElements(TimestampedValue<T>, TimestampedValue<T>...) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Adds the specified elements to the source with the provided timestamps.
- addElements(T, T...) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Adds the specified elements to the source with timestamp equal to the current watermark.
- addEncodingInput(Coder<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Sets the encoding for this Dataflow step.
- addErrorCollection(PCollection<ErrorT>) - Method in interface org.apache.beam.sdk.transforms.errorhandling.ErrorHandler
- addErrorCollection(PCollection<ErrorT>) - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.DefaultErrorHandler
- addErrorCollection(PCollection<ErrorT>) - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler
- addErrorForCode(int, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.Builder
-
Adds a matcher to log the provided string if the error matches a particular status code.
- addErrorForCodeAndUrlContains(int, String, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.Builder
-
Adds a matcher to log the provided string if the error matches a particular status code and the url contains a certain string.
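A minimal sketch of the CustomHttpErrors builder above, assuming a no-arg Builder constructor; the status code and message are illustrative:

    import org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors;

    // Log a custom hint whenever a matching HTTP error is seen.
    CustomHttpErrors.Builder builder = new CustomHttpErrors.Builder();
    builder.addErrorForCode(403, "Check that the project has the required permissions.");
    CustomHttpErrors errors = builder.build();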
- addExceptionStackTrace(Exception) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
- addExperiment(ExperimentalOptions, String) - Static method in interface org.apache.beam.sdk.options.ExperimentalOptions
-
Adds experiment to options if not already present.
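A minimal sketch of addExperiment; the experiment name is illustrative:

    import org.apache.beam.sdk.options.ExperimentalOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Appends the experiment only if it is not already present.
    ExperimentalOptions options =
        PipelineOptionsFactory.create().as(ExperimentalOptions.class);
    ExperimentalOptions.addExperiment(options, "use_runner_v2");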
- addFailure(String, String) - Method in class org.apache.beam.sdk.io.cdap.context.FailureCollectorWrapper
- addField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addField(Schema.Field) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addFields(List<Schema.Field>) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addFields(Schema.Field...) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- AddFields - Class in org.apache.beam.sdk.schemas.transforms
-
A transform to add new nullable fields to a PCollection's schema.
- AddFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.AddFields
- AddFields.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
-
Inner PTransform for AddFields.
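A minimal sketch of AddFields.Inner; "rows" is a hypothetical schema'd PCollection<Row>, and the field name is illustrative:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.transforms.AddFields;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Adds a nullable "region" field; existing elements get null for it.
    PCollection<Row> withRegion =
        rows.apply(AddFields.<Row>create().field("region", Schema.FieldType.STRING));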
- addFloatField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- AddHarnessIdInterceptor - Class in org.apache.beam.sdk.fn.channel
-
A ClientInterceptor that attaches a provided SDK Harness ID to outgoing messages.
- addHumanReadableJson(Object) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
- addIfAbsent(T) - Method in interface org.apache.beam.sdk.state.SetState
-
Ensures a value is a member of the set, returning true if it was added and false otherwise.
- addIfNotDefault(DisplayData.ItemSpec<T>, T) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register the given display item if the value is different than the specified default.
- addIfNotNull(DisplayData.ItemSpec<?>) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register the given display item if the value is not null.
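A minimal sketch of the DisplayData.Builder methods above, from a transform's populateDisplayData override; the item names and values are illustrative:

    import org.apache.beam.sdk.transforms.display.DisplayData;

    @Override
    public void populateDisplayData(DisplayData.Builder builder) {
      super.populateDisplayData(builder);
      builder
          .add(DisplayData.item("topic", "projects/p/topics/t"))
          .addIfNotNull(DisplayData.item("filter", (String) null)); // skipped when null
    }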
- addIncompleteNewPartitions(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
-
Add NewPartition if it hasn't been updated for 15 minutes.
- addIncompleteNewPartitions(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
-
Capture NewPartition row that cannot merge on its own.
- addInput(double[], Double) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- addInput(int[], Integer) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- addInput(long[], Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- addInput(long[], Long) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- addInput(AccumT, InputT) - Method in interface org.apache.beam.sdk.extensions.sql.udf.AggregateFn
-
Adds the given input value to the given accumulator, returning the new accumulator value.
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Adds the given input value to the given accumulator, returning the new accumulator value.
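A minimal sketch of a Combine.CombineFn whose addInput folds each element into the accumulator (here, a running sum of string lengths; the class name is illustrative):

    import org.apache.beam.sdk.transforms.Combine;

    class TotalLengthFn extends Combine.CombineFn<String, Long, Long> {
      @Override public Long createAccumulator() { return 0L; }
      // addInput: fold one input into the accumulator and return the new value.
      @Override public Long addInput(Long accum, String input) { return accum + input.length(); }
      @Override public Long mergeAccumulators(Iterable<Long> accums) {
        long sum = 0L;
        for (long a : accums) { sum += a; }
        return sum;
      }
      @Override public Long extractOutput(Long accum) { return accum; }
    }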
- addInput(AccumT, InputT, Long, Long, Long) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
- addInput(AccumT, InputT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Adds the given input value to the given accumulator, returning the new accumulator value.
- addInput(HyperLogLogPlus, InputT) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
- addInput(MergingDigest, Double) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
- addInput(InputT) - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
-
Adds the given input value to this accumulator, modifying this accumulator.
- addInput(Iterable<T>, T) - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- addInput(Long, Long) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
- addInput(Object[], DataT) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- addInput(Object[], DataT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- addInput(String, byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
- addInput(String, Boolean) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name and value to this Dataflow step.
- addInput(String, Long) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name and value to this Dataflow step.
- addInput(String, String) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name and value to this Dataflow step.
- addInput(String, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
- addInput(String, List<? extends Map<String, Object>>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input that is a list of objects.
- addInput(String, Map<String, Object>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input that is a dictionary of strings to objects.
- addInput(String, PInput) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name to this Dataflow step, coming from the specified input PValue.
- addInput(List<String>, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- addInput(List<T>, T) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- addInput(List<V>, V) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- addInput(K, AccumT, AccumT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.FinalFlinkCombiner
- addInput(K, AccumT, InputT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.CompleteFlinkCombiner
- addInput(K, AccumT, InputT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in interface org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.FlinkCombiner
- addInput(K, AccumT, InputT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.PartialFlinkCombiner
- addInput(SequenceRangeAccumulator, TimestampedValue<KV<EventKeyT, KV<Long, EventT>>>) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- addInput(SketchFrequencies.Sketch<InputT>, InputT) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
- addInput(CovarianceAccumulator, Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- addInput(VarianceAccumulator, T) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- addInput(BeamBuiltinAggregations.BitXOr.Accum, T) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- addInput(ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique, T) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- addInput(Combine.Holder<V>, V) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
- addInput(Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>, T) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
- addInt16Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addInt32Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addInt64Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addIterable(Iterable<T>) - Method in class org.apache.beam.sdk.values.Row.Builder
- addIterableField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- Additional Outputs - Search tag in class org.apache.beam.sdk.transforms.ParDo
- Section
- additionalOutputTags - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- addKnownCoderUrn(String) - Static method in class org.apache.beam.runners.fnexecution.wire.LengthPrefixUnknownCoders
-
Registers a coder as being of known type and as such not meriting length prefixing.
- addLabel(String, String) - Method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.MetricNameBuilder
-
Add a metric label KV pair to the metric name.
- addLengthPrefixedCoder(String, RunnerApi.Components.Builder, boolean) - Static method in class org.apache.beam.runners.fnexecution.wire.LengthPrefixUnknownCoders
-
Recursively traverses the coder tree and wraps the first unknown coder in every branch with a LengthPrefixCoder unless an ancestor coder is itself a LengthPrefixCoder.
- addList(Map<String, Object>, String, List<? extends Map<String, Object>>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addList(Map<String, Object>, String, T[]) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addLogicalTypeConversions(GenericData) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
- addLogicalTypeField(String, Schema.LogicalType<InputT, BaseT>) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addLong(Map<String, Object>, String, long) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addLongs(Map<String, Object>, String, long...) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addMapField(String, Schema.FieldType, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addMessageListener(Consumer<JobApi.JobMessage>) - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Listen for job messages with a Consumer.
- addMethodParameters(Method) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
- addMetricLabel(String, String) - Method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.MetricNameBuilder
-
Add a metric label KV pair to the metric.
- addMissingPartitions(List<Range.ByteStringRange>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
-
Add all the missingPartitions.
- addMissingPartitions(List<Range.ByteStringRange>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
-
Capture partitions that are not currently being streamed.
- addNameFilter(MetricNameFilter) - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
-
Add a MetricNameFilter.
- addNull(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addNullableArrayField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableBooleanField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableByteArrayField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableByteField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableDateTimeField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableDecimalField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableDoubleField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableFloatField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableInt16Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableInt32Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableInt64Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableIterableField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableLogicalTypeField(String, Schema.LogicalType<InputT, BaseT>) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableMapField(String, Schema.FieldType, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableRowField(String, Schema) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addNullableStringField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addObject(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addOptions(Schema.Options) - Method in class org.apache.beam.sdk.schemas.Schema.Options.Builder
- addOutput(Output) - Method in class org.apache.beam.sdk.io.cdap.context.BatchSinkContextImpl
-
Overrides the output configuration of this Batch job to the specified Output.
- addOutput(String, PCollection<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds a primitive output to this Dataflow step with the given name as the local output name, producing the specified output PValue, including its Coder if a TypedPValue.
- addOutputColumnList(List<ResolvedNodes.ResolvedOutputColumn>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
- addOverrideForClass(Class<?>, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Deprecated. Overrides the default log level for the passed in class.
- addOverrideForClass(Class<?>, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
-
Overrides the default log level for the passed in class.
- addOverrideForName(String, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Deprecated. Overrides the default log level for the passed in name.
- addOverrideForName(String, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
-
Overrides the default log level for the passed in name.
- addOverrideForPackage(Package, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Deprecated. Overrides the default log level for the passed in package.
- addOverrideForPackage(Package, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
-
Overrides the default log level for the passed in package.
- addProperties(MetadataEntity, Map<String, String>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- addReader(int) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplitEnumerator
- addReader(int) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.LazyFlinkSourceSplitEnumerator
- addResolvedTable(TableResolution.SimpleTableWithPath) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
-
Store a table together with its full path for repeated resolutions.
- addRowField(String, Schema) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addRows(Object...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
-
Add rows to the builder.
- addRows(String, Row...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- addRows(Duration, Object...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
-
Add rows to the builder.
- addRunnerWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components.Builder, RunnerApi.ExecutableStagePayload.WireCoderSetting) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
-
Creates a runner-side wire coder for a port read/write for the given PCollection.
- addSchema(String, TableProvider) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Add a top-level schema backed by the table provider.
- addSdkWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components.Builder, RunnerApi.ExecutableStagePayload.WireCoderSetting) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
-
Creates an SDK-side wire coder for a port read/write for the given PCollection.
- AddShardKeyDoFn - Class in org.apache.beam.sdk.io.solace.write
-
This class adds a pseudo-key with a given cardinality.
- AddShardKeyDoFn(int) - Constructor for class org.apache.beam.sdk.io.solace.write.AddShardKeyDoFn
- addSideInputValue(StreamRecord<RawUnionValue>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
-
Add the side input value.
- addSideInputValue(StreamRecord<RawUnionValue>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- addSplits(List<FlinkSourceSplit<T>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- addSplitsBack(List<FlinkSourceSplit<T>>, int) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplitEnumerator
- addSplitsBack(List<FlinkSourceSplit<T>>, int) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.LazyFlinkSourceSplitEnumerator
- addSplitsToUnfinishedForCheckpoint(long, List<FlinkSourceSplit<T>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
-
To be overridden in unbounded reader.
- addSplitsToUnfinishedForCheckpoint(long, List<FlinkSourceSplit<T>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
- addStateListener(Consumer<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Listen for job state changes with a Consumer.
- addStep(String) - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
-
Add a step filter.
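A minimal sketch combining addNameFilter and addStep; the namespace, metric, and step names are illustrative:

    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    MetricsFilter filter =
        MetricsFilter.builder()
            .addNameFilter(MetricNameFilter.named("my.namespace", "elementsProcessed"))
            .addStep("ParseEvents")
            .build();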
- addStep(PTransform<?, ?>, String) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Adds a step to the Dataflow workflow for the given transform, with the given Dataflow step type.
- addString(Map<String, Object>, String, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addStringField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- addStringList(Map<String, Object>, String, List<String>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- addTags(MetadataEntity, Iterable<String>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- addTags(MetadataEntity, String...) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- addTimers(Iterator<TimerInternals.TimerData>) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- addToCurrentBundle(Solace.Record) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- addTraceFor(AbstractGoogleClientRequest<?>, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
-
Creates a GoogleApiDebugOptions.GoogleApiTracer that sets the trace traceDestination on all calls that match for the given request type.
- addTraceFor(AbstractGoogleClient, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
-
Creates a GoogleApiDebugOptions.GoogleApiTracer that sets the trace destination on all calls that match the given client type.
- addUdaf(String, Combine.CombineFn) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Register a UDAF function which can be used in GROUP-BY expression.
- addUdf(String, Class<?>, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Register a UDF function which can be used in SQL expression.
- addUdf(String, Class<? extends BeamSqlUdf>) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Register a UDF function which can be used in SQL expression.
- addUdf(String, SerializableFunction) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Register a UDF function which can be used in SQL expression.
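A hedged sketch of UDF registration on a BeamSqlEnvBuilder; how the builder is obtained, and any options it requires before build(), are elided here, and the function name "str_len" is illustrative:

    import org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv;
    import org.apache.beam.sdk.transforms.SerializableFunction;

    static BeamSqlEnv registerUdfs(BeamSqlEnv.BeamSqlEnvBuilder builder) {
      // Registers a SerializableFunction-backed UDF usable in SQL expressions.
      builder.addUdf("str_len", (SerializableFunction<String, Integer>) String::length);
      return builder.build();
    }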
- addUuids() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
-
Add Uuids to to-be-published messages, ensuring that uniqueness is maintained.
- AddUuidsTransform - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
A transform to add UUIDs to each message to be written to Pub/Sub Lite.
- AddUuidsTransform() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.AddUuidsTransform
- addValue(Object) - Method in class org.apache.beam.sdk.values.Row.Builder
- addValues(Object...) - Method in class org.apache.beam.sdk.values.Row.Builder
- addValues(List<Object>) - Method in class org.apache.beam.sdk.values.Row.Builder
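A minimal sketch pairing Row.Builder with a schema; the field names and values are illustrative, and addValues must match the schema's field order and types:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    Schema schema =
        Schema.builder().addStringField("name").addInt64Field("count").build();
    Row row = Row.withSchema(schema).addValues("beam", 42L).build();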
- addWatermarkHoldUsage(Instant) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals
- advance() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
- advance() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
See GlobalWatermarkHolder.advance(String).
- advance() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
For subscription mode only: Track progression of time according to the Clock passed.
- advance() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- advance() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
- advance() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Advances the reader to the next valid record.
- advance() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Advances the reader to the next valid record.
- advanceBy(Duration) - Static method in class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
-
For internal use only: no backwards compatibility guarantees.
- Advanced features - Search tag in class org.apache.beam.sdk.io.TextIO
- Section
- Advanced Kafka Configuration - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- Advanced SolaceIO#read(TypeDescriptor, SerializableFunction, SerializableFunction) top-level method - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- advanceImpl() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
- advanceImpl() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
Advances to the next record and returns true, or returns false if there is no next record.
- advanceNextBatchWatermarkToInfinity() - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Advances the watermark in the next batch to the end-of-time.
- advanceProcessingTime(Duration) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Advance the processing time by the specified amount.
- advanceTo(Instant) - Static method in class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
-
For internal use only: no backwards compatibility guarantees.
- advanceWatermark() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Advances the watermark.
- advanceWatermarkForNextBatch(Instant) - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Advances the watermark in the next batch.
- advanceWatermarkTo(Instant) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Advance the watermark of this source to the specified instant.
- advanceWatermarkToInfinity() - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Advance the watermark to infinity, completing this TestStream.
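A minimal sketch tying together the TestStream.Builder methods indexed above (addElements, advanceProcessingTime, advanceWatermarkTo, advanceWatermarkToInfinity); the elements and times are illustrative:

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.testing.TestStream;
    import org.apache.beam.sdk.values.TimestampedValue;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    TestStream<String> events =
        TestStream.create(StringUtf8Coder.of())
            .addElements(TimestampedValue.of("a", new Instant(0)))
            .advanceProcessingTime(Duration.standardMinutes(1))
            .advanceWatermarkTo(new Instant(1000))
            .addElements("b", "c") // timestamped at the current watermark
            .advanceWatermarkToInfinity();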
- AdvancingPhaser - Class in org.apache.beam.sdk.fn.stream
-
A Phaser which never terminates.
- AdvancingPhaser(int) - Constructor for class org.apache.beam.sdk.fn.stream.AdvancingPhaser
- AfterAll - Class in org.apache.beam.sdk.transforms.windowing
-
A composite Trigger that fires when all of its sub-triggers are ready.
- afterBundleCommit(Instant, DoFn.BundleFinalizer.Callback) - Method in interface org.apache.beam.sdk.transforms.DoFn.BundleFinalizer
-
The provided function will be called after the runner successfully commits the output of a successful bundle.
- AfterEach - Class in org.apache.beam.sdk.transforms.windowing
-
A composite Trigger that executes its sub-triggers in order.
- AfterFirst - Class in org.apache.beam.sdk.transforms.windowing
-
A composite Trigger that fires once after at least one of its sub-triggers has fired.
- afterIterations(int) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Returns a Watch.Growth.TerminationCondition that holds after the given number of polling iterations have occurred per-input.
- AfterPane - Class in org.apache.beam.sdk.transforms.windowing
-
A Trigger that fires at some point after a specified number of input elements have arrived.
- AfterProcessingTime - Class in org.apache.beam.sdk.transforms.windowing
-
A Trigger that fires at a specified point in processing time, relative to when input first arrives.
- AfterSynchronizedProcessingTime - Class in org.apache.beam.sdk.transforms.windowing
-
FOR INTERNAL USE ONLY.
- afterTimeSinceNewOutput(SerializableFunction<InputT, ReadableDuration>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Like Watch.Growth.afterTimeSinceNewOutput(ReadableDuration), but the duration is input-dependent.
- afterTimeSinceNewOutput(ReadableDuration) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Returns a Watch.Growth.TerminationCondition that holds after the given time has elapsed after the last time the Watch.Growth.PollResult for the current input contained a previously unseen output.
- afterTotalOf(SerializableFunction<InputT, ReadableDuration>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Like Watch.Growth.afterTotalOf(ReadableDuration), but the duration is input-dependent.
- afterTotalOf(ReadableDuration) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Returns a Watch.Growth.TerminationCondition that holds after the given time has elapsed after the current input was seen.
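A minimal sketch of one of the Watch.Growth termination conditions above; the String input type is illustrative, and such a condition would typically be passed to Watch.growthOf(...).withTerminationPerInput(...):

    import org.apache.beam.sdk.transforms.Watch;
    import org.joda.time.Duration;

    // Stop polling a given input one hour after it was first seen.
    Watch.Growth.TerminationCondition<String, ?> cond =
        Watch.Growth.<String>afterTotalOf(Duration.standardHours(1));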
- AfterWatermark - Class in org.apache.beam.sdk.transforms.windowing
-
AfterWatermark triggers fire based on progress of the system watermark.
- AfterWatermark.AfterWatermarkEarlyAndLate - Class in org.apache.beam.sdk.transforms.windowing
- AfterWatermark.FromEndOfWindow - Class in org.apache.beam.sdk.transforms.windowing
-
A watermark trigger targeted relative to the end of the window.
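A minimal sketch composing the triggers indexed here with Window (see also accumulatingFiredPanes above); "input" is a hypothetical PCollection<String>:

    import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
    import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Fire at the watermark, with early firings aligned to minute boundaries,
    // and accumulate elements across fired panes.
    PCollection<String> windowed =
        input.apply(
            Window.<String>into(FixedWindows.of(Duration.standardMinutes(5)))
                .triggering(
                    AfterWatermark.pastEndOfWindow()
                        .withEarlyFirings(
                            AfterProcessingTime.pastFirstElementInPane()
                                .alignedTo(Duration.standardMinutes(1))))
                .withAllowedLateness(Duration.ZERO)
                .accumulatingFiredPanes());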
- aggregate(Combine.CombineFn<InputT, ?, OutputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Aggregate the grouped data using the specified Combine.CombineFn.
- AggregateCombiner() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
-
Build up an aggregation function over the input elements.
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
-
Build up an aggregation function over the input elements by field id.
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- AggregateFn<InputT,
AccumT, - Interface in org.apache.beam.sdk.extensions.sql.udfOutputT> -
An aggregate function that can be executed as part of a SQL query.
- AggregationCombineFnAdapter<T> - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
-
Wrapper
Combine.CombineFn
s for aggregation function calls. - AggregationCombineFnAdapter() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
- Aggregation of records - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- AggregationQuery - Class in org.apache.beam.sdk.io.mongodb
-
Builds a MongoDB AggregateIterable object.
- AggregationQuery() - Constructor for class org.apache.beam.sdk.io.mongodb.AggregationQuery
- algorithm(String) - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey.Builder
- ALIAS - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.DefaultType
- align(Duration) - Method in interface org.apache.beam.sdk.state.Timer
-
Aligns the target timestamp used by
Timer.setRelative()
to the next boundary ofperiod
. - alignedTo(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
Aligns the time to be the smallest multiple of
period
greater than the epoch boundary (akanew Instant(0)
). - alignedTo(Duration, Instant) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
Aligns timestamps to the smallest multiple of
period
since theoffset
greater than the timestamp. - alignTo(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
-
For internal use only; no backwards-compatibility guarantees.
- alignTo(Duration, Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
-
For internal use only; no backwards-compatibility guarantees.
- AlignTo() - Constructor for class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
- ALL - Enum constant in enum class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.ListQualifier
- ALL - Enum constant in enum class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.MapQualifier
- ALL_CONTEXTS - Static variable in class org.apache.beam.sdk.testing.CoderProperties
-
All the contexts, for use in test cases.
- ALL_KEYS - Static variable in class org.apache.beam.sdk.io.range.ByteKeyRange
-
The range of all keys, with empty start and end keys.
- allLeavesDescriptor(Schema, SerializableFunction<List<String>, String>) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
- allMatches(String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.AllMatches
PTransform
that checks if the entire line matches the Regex. - allMatches(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.AllMatches
PTransform
that checks if the entire line matches the Regex. - AllMatches(Pattern) - Constructor for class org.apache.beam.sdk.transforms.Regex.AllMatches
- allMetrics() - Method in class org.apache.beam.sdk.metrics.MetricResults
- allocateAddressAndCreate(List<BindableService>, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.sdk.fn.server.InProcessServerFactory
- allocateAddressAndCreate(List<BindableService>, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Creates an instance of this server using an ephemeral address.
- allocateAddressAndCreate(List<BindableService>, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.sdk.fn.server.ServerFactory.InetSocketAddressServerFactory
- allocatePortAndCreateFor(List<? extends FnService>, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Create
GrpcFnServer
s for the providedFnService
s running on an arbitrary port. - allocatePortAndCreateFor(ServiceT, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Create a
GrpcFnServer
for the providedFnService
running on an arbitrary port. - allOf(Iterable<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.allOf(Iterable)
. - allOf(SerializableMatcher<T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.allOf(Matcher[])
. - allOf(Watch.Growth.TerminationCondition<InputT, FirstStateT>, Watch.Growth.TerminationCondition<InputT, SecondStateT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Returns a
Watch.Growth.TerminationCondition
that holds when both of the given two conditions hold. - ALLOW - Enum constant in enum class org.apache.beam.sdk.io.fs.EmptyMatchTreatment
-
Filepatterns matching no resources are allowed.
- ALLOW_DUPLICATES - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- ALLOW_FIELD_ADDITION - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.SchemaUpdateOption
-
Allow adding a nullable field to the schema.
- ALLOW_FIELD_RELAXATION - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.SchemaUpdateOption
-
Allow relaxing a required field in the original schema to nullable.
- ALLOW_IF_WILDCARD - Enum constant in enum class org.apache.beam.sdk.io.fs.EmptyMatchTreatment
-
Filepatterns matching no resources are allowed if the filepattern contains a glob wildcard character, and disallowed otherwise (i.e.
- allowDuplicates() - Method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
-
Returns whether it allows duplicated elements in the output.
- ALLOWS_SHARDABLE_STATE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
Whether this reader should allow dynamic splitting of the offset ranges.
- allReaders() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- AlwaysPassMatcher() - Constructor for class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
- AlwaysPassMatcherFactory() - Constructor for class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcherFactory
- alwaysRetry() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
-
Always retry all failures.
- alwaysUseRead() - Method in class org.apache.beam.sdk.transforms.Create.Values
- AmqpIO - Class in org.apache.beam.sdk.io.amqp
-
AmqpIO supports AMQP 1.0 protocol using the Apache QPid Proton-J library.
- AmqpIO.Read - Class in org.apache.beam.sdk.io.amqp
-
A
PTransform
to read/receive messages using AMQP 1.0 protocol. - AmqpIO.Write - Class in org.apache.beam.sdk.io.amqp
-
A
PTransform
to send messages using AMQP 1.0 protocol. - AmqpMessageCoder - Class in org.apache.beam.sdk.io.amqp
-
A coder for AMQP message.
- AmqpMessageCoder() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
- AmqpMessageCoderProviderRegistrar - Class in org.apache.beam.sdk.io.amqp
-
A
CoderProviderRegistrar
for standard types used withAmqpIO
. - AmqpMessageCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpMessageCoderProviderRegistrar
- and(Iterable<PCollection<T>>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
Returns a new
PCollectionList
that has all thePCollections
of thisPCollectionList
plus the givenPCollections
appended to the end, in order. - and(String, PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
A version of
KeyedPCollectionTuple.and(String, PCollection)
that takes in a string instead of a TupleTag. - and(String, PCollection<Row>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Returns a new
PCollectionRowTuple
that has eachPCollection
and tag of thisPCollectionRowTuple
plus the givenPCollection
associated with the given tag. - and(String, PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
A version of
PCollectionTuple.and(TupleTag, PCollection)
that takes in a String instead of a TupleTag. - and(List<TupleTag<?>>) - Method in class org.apache.beam.sdk.values.TupleTagList
-
Returns a new
TupleTagList
that has all theTupleTags
of thisTupleTagList
plus the givenTupleTags
appended to the end, in order. - and(PCollection.IsBounded) - Method in enum class org.apache.beam.sdk.values.PCollection.IsBounded
-
Returns the composed IsBounded property.
- and(PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
Returns a new
PCollectionList
that has all thePCollections
of thisPCollectionList
plus the givenPCollection
appended to the end. - and(TupleTag<?>) - Method in class org.apache.beam.sdk.values.TupleTagList
-
Returns a new
TupleTagList
that has all theTupleTags
of thisTupleTagList
plus the givenTupleTag
appended to the end. - and(TupleTag<T>, PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
Returns a new
PCollectionTuple
that has eachPCollection
andTupleTag
of thisPCollectionTuple
plus the givenPCollection
associated with the givenTupleTag
. - and(TupleTag<V>, List<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Returns a new
CoGbkResult
based on this, with the given tag and given data added to it. - and(TupleTag<V>, PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Returns a new
KeyedPCollectionTuple<K>
that is the same as this, appended with the given PCollection. - annotateFromBytes(PCollectionView<Map<ByteString, VideoContext>>, List<Feature>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
-
Annotates videos from ByteStrings of their contents.
- annotateFromBytesWithContext(List<Feature>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
-
Annotates videos from key-value pairs of ByteStrings and VideoContext.
- annotateFromURI(List<Feature>, PCollectionView<Map<String, VideoContext>>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
-
Annotates videos from GCS URIs.
- annotateFromUriWithContext(List<Feature>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
-
Annotates videos from key-value pairs of GCS URI and VideoContext.
- annotateImagesFromBytes(PCollectionView<Map<ByteString, ImageContext>>, List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a
PTransform
that annotates images from their contents encoded inByteString
s. - annotateImagesFromBytes(PCollectionView<Map<ByteString, ImageContext>>, List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a
PTransform
that annotates images from their contents encoded inByteString
s. - AnnotateImagesFromBytes(PCollectionView<Map<ByteString, ImageContext>>, List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytes
- annotateImagesFromBytesWithContext(List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a
PTransform
that annotates images from KVs of their GCS addresses in Strings andImageContext
for each image. - annotateImagesFromBytesWithContext(List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a
PTransform
that annotates images from KVs of their GCS addresses in Strings andImageContext
for each image. - AnnotateImagesFromBytesWithContext(List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytesWithContext
- annotateImagesFromGcsUri(PCollectionView<Map<String, ImageContext>>, List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a
PTransform
that annotates images from their GCS addresses. - annotateImagesFromGcsUri(PCollectionView<Map<String, ImageContext>>, List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a
PTransform
that annotates images from their GCS addresses. - AnnotateImagesFromGcsUri(PCollectionView<Map<String, ImageContext>>, List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUri
- annotateImagesFromGcsUriWithContext(List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a
PTransform
that annotates images from KVs of their String-encoded contents andImageContext
for each image. - annotateImagesFromGcsUriWithContext(List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a
PTransform
that annotates images from KVs of their String-encoded contents andImageContext
for each image. - AnnotateImagesFromGcsUriWithContext(List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUriWithContext
- AnnotateText - Class in org.apache.beam.sdk.extensions.ml
-
A
PTransform
using the Cloud AI Natural language processing capability. - AnnotateText() - Constructor for class org.apache.beam.sdk.extensions.ml.AnnotateText
- AnnotateText.Builder - Class in org.apache.beam.sdk.extensions.ml
- AnnotateVideoFromBytes(PCollectionView<Map<ByteString, VideoContext>>, List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromBytes
- AnnotateVideoFromBytesWithContext(List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromBytesWithContext
- AnnotateVideoFromUri(PCollectionView<Map<String, VideoContext>>, List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromUri
- AnnotateVideoFromURIWithContext(List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromURIWithContext
- annotations - Variable in class org.apache.beam.sdk.transforms.PTransform
- Annotations For PipelineOptions - Search tag in interface org.apache.beam.sdk.options.PipelineOptions
- Section
- any(long) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Sample#any(long)
takes aPCollection<T>
and a limit, and produces a newPCollection<T>
containing up to limit elements of the inputPCollection
. - anyCombineFn(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a
Combine.CombineFn
that computes a fixed-sized potentially non-uniform sample of its inputs. - anyOf(Iterable<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.anyOf(Iterable)
. - anyOf(SerializableMatcher<T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.anyOf(Matcher[])
. - anything() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.anything()
. - anyValueCombineFn() - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a
Combine.CombineFn
that computes a single and potentially non-uniform sample value of its inputs. - API_METRIC_LABEL - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
- ApiIOError - Class in org.apache.beam.io.requestresponse
-
ApiIOError
is a data class for storing details about an error. - ApiIOError() - Constructor for class org.apache.beam.io.requestresponse.ApiIOError
- append(K, W, Iterator<V>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
-
Appends the values to the bag user state for the given key and window.
- APPEND - Enum constant in enum class org.apache.beam.sdk.io.redis.RedisIO.Write.Method
-
Use APPEND command.
- APPEND - Enum constant in enum class org.apache.beam.sdk.io.snowflake.enums.WriteDisposition
- APPEND_ROWS - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.RpcMethod
- appendRows(long, ProtoRows) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
-
Append rows to a Storage API write stream at the given offset.
- appendRowsRowStatusCounter(BigQuerySinkMetrics.RowStatus, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
- applicableTo(PCollection<?>) - Static method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
- applicableTo(PCollection<?>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
- ApplicationNameOptions - Interface in org.apache.beam.sdk.options
-
Options that allow setting the application name.
- apply(byte[]) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly
- apply(byte[]) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePubsubMessageProtoAsPayload
- apply(double, double) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
Applies the binary operation to the two operands, returning the result.
- apply(int, int) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
Applies the binary operation to the two operands, returning the result.
- apply(long, long) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
Applies the binary operation to the two operands, returning the result.
- apply(Row) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableRowToBeamRow
- apply(MongoCollection<Document>) - Method in class org.apache.beam.sdk.io.mongodb.AggregationQuery
- apply(MongoCollection<Document>) - Method in class org.apache.beam.sdk.io.mongodb.FindQuery
- apply(InputT) - Method in interface org.apache.beam.sdk.coders.DelegateCoder.CodingFunction
- apply(InputT) - Method in class org.apache.beam.sdk.transforms.InferableFunction
- apply(InputT) - Method in interface org.apache.beam.sdk.transforms.ProcessFunction
-
Returns the result of invoking this function on the given input.
- apply(InputT) - Method in interface org.apache.beam.sdk.transforms.SerializableFunction
-
Returns the result of invoking this function on the given input.
- apply(InputT) - Method in class org.apache.beam.sdk.transforms.SimpleFunction
- apply(InputT, Contextful.Fn.Context) - Method in interface org.apache.beam.sdk.transforms.Contextful.Fn
-
Invokes the function on the given input with the given context.
- apply(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- apply(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Applies this
CombineFn
to a collection of input values to produce a combined output value. - apply(Iterable<? extends InputT>, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Applies this
CombineFnWithContext
to a collection of input values to produce a combined output value. - apply(String, Session) - Method in class org.apache.beam.sdk.io.jms.TextMessageMapper
- apply(String, PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
- apply(String, PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.values.PBegin
-
Applies the given
PTransform
to thisPBegin
, usingname
to identify this specific application of the transform. - apply(String, PTransform<? super PCollection<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollection
-
Applies the given
PTransform
to this inputPCollection
, usingname
to identify this specific application of the transform. - apply(String, PTransform<? super PCollectionRowTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Applies the given
PTransform
to this inputPCollectionRowTuple
, usingname
to identify this specific application of the transform. - apply(String, PTransform<? super PCollectionTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
Applies the given
PTransform
to this inputPCollectionTuple
, usingname
to identify this specific application of the transform. - apply(String, PTransform<KeyedPCollectionTuple<K>, OutputT>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
- apply(String, PTransform<PCollectionList<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
Applies the given
PTransform
to this inputPCollectionList
, usingname
to identify this specific application of the transform. - apply(String, T) - Method in interface org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches.EntryMapperFn.Builder
- apply(Void) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceProviderFromDataSourceConfiguration
- apply(Void) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.PoolableDataSourceProvider
- apply(Void) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverProviderFromDriverConfiguration
- apply(Void) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceProviderFromDataSourceConfiguration
- apply(SQLException) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DefaultRetryStrategy
- apply(SQLException) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.RetryStrategy
- apply(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.GenericDatumFactory
- apply(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.ReflectDatumFactory
- apply(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.SpecificDatumFactory
- apply(Schema) - Method in interface org.apache.beam.sdk.extensions.avro.io.AvroSink.DatumWriterFactory
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.GenericDatumFactory
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.ReflectDatumFactory
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.SpecificDatumFactory
- apply(Schema, Schema) - Method in interface org.apache.beam.sdk.extensions.avro.io.AvroSource.DatumReaderFactory
- apply(ByteArray, Option<byte[]>, State<StateAndTimers>) - Method in class org.apache.beam.runners.spark.translation.streaming.ParDoStateUpdateFn
- apply(SqsMessage) - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadSchemaTransformProvider.SqsMessageToBeamRow
- apply(FileIO.ReadableFile, OffsetRange, Exception) - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler
- apply(HealthcareIOError<T>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
- apply(PubsubMessage) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePayloadAsPubsubMessageProto
- apply(Pipeline, String, RunnerApi.FunctionSpec, Map<String, PCollection<?>>) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.schemas.transforms.Cast.Narrowing
- apply(Schema, Schema) - Method in interface org.apache.beam.sdk.schemas.transforms.Cast.Validator
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.schemas.transforms.Cast.Widening
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
- apply(Materializations.IterableView<KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapViewFn
- apply(Materializations.IterableView<KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapViewFn
- apply(Materializations.IterableView<T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
- apply(Materializations.IterableView<T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
- apply(Materializations.IterableView<T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
- apply(Materializations.IterableView<T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
- apply(Materializations.MultimapView<Long, PCollectionViews.ValueOrMetadata<T, OffsetRange>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn2
- apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapFromVoidKeyViewFn
- apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapFromVoidKeyViewFn
- apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
-
Deprecated.
- apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
-
Deprecated.
- apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
- apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
-
Deprecated.
- apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated.
- apply(Materializations.MultimapView<K, V>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn2
- apply(Materializations.MultimapView<K, V>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn2
- apply(PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
-
Like
Pipeline.apply(String, PTransform)
but the transform node in thePipeline
graph will be named according toPTransform.getName()
. - apply(PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.values.PBegin
-
Like
PBegin.apply(String, PTransform)
but defaulting to the name of thePTransform
. - apply(PTransform<? super PCollection<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollection
-
of the
PTransform
. - apply(PTransform<? super PCollectionRowTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Like
PCollectionRowTuple.apply(String, PTransform)
but defaulting to the name of thePTransform
. - apply(PTransform<? super PCollectionTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
Like
PCollectionTuple.apply(String, PTransform)
but defaulting to the name of thePTransform
. - apply(PTransform<KeyedPCollectionTuple<K>, OutputT>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Like
KeyedPCollectionTuple.apply(String, PTransform)
but defaulting to the name provided by thePTransform
. - apply(PTransform<PCollectionList<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
Like
PCollectionList.apply(String, PTransform)
but defaulting to the name of thePTransform
. - apply(WithFailures.ExceptionElement<T>) - Method in class org.apache.beam.sdk.transforms.WithFailures.ExceptionAsMapHandler
- apply(WithFailures.ExceptionElement<T>) - Method in class org.apache.beam.sdk.transforms.WithFailures.ThrowableHandler
- apply(KV<String, Long>) - Method in class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.FormatAsTextFn
- apply(Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.BeamSqlRow2KvFn
- apply(Row) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation.ToBigtableRowFn
- apply(Row) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.GetMutationsFromBeamRow
- apply(ValueInSingleWindow<byte[]>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.ParsePubsubMessageProtoAsPayloadFromWindowedValue
- apply(TopicPartition) - Method in class org.apache.beam.sdk.io.kafka.CheckStopReadingFnWrapper
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
- apply(Statement, Description) - Method in class org.apache.beam.sdk.testing.TestPipeline
- apply(PrimitiveViewT) - Method in class org.apache.beam.sdk.transforms.ViewFn
-
A function to adapt a primitive view type to a desired view type.
- apply(Tuple2<ByteArray, byte[]>) - Method in class org.apache.beam.runners.spark.coders.CoderHelpers.FromByteFunction
- apply(T) - Method in class org.apache.beam.sdk.testing.PAssert.MatcherCheckerFn
- apply(T1) - Method in interface org.apache.beam.sdk.function.ThrowingFunction
- apply(T1, T2) - Method in interface org.apache.beam.sdk.function.ThrowingBiFunction
- apply(V, V) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Applies the binary operation to the two operands, returning the result.
- applyBeamPTransform(Map<String, ? extends DataSet<?>>, PTransform<PCollectionTuple, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyBeamPTransform(Map<String, ? extends DataStream<?>>, PTransform<PCollectionTuple, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyBeamPTransform(DataSet<InputT>, PTransform<CollectionT, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyBeamPTransform(ExecutionEnvironment, PTransform<PBegin, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyBeamPTransform(DataStream<InputT>, PTransform<CollectionT, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyBeamPTransform(StreamExecutionEnvironment, PTransform<PBegin, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyInputWatermarkHold(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
-
Allows to apply a hold to the input watermark.
- applyInputWatermarkHold(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- applyMultiOutputBeamPTransform(Map<String, ? extends DataSet<?>>, PTransform<PCollectionTuple, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyMultiOutputBeamPTransform(Map<String, ? extends DataStream<?>>, PTransform<PCollectionTuple, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyMultiOutputBeamPTransform(DataSet<InputT>, PTransform<CollectionT, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyMultiOutputBeamPTransform(ExecutionEnvironment, PTransform<PBegin, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyMultiOutputBeamPTransform(DataStream<InputT>, PTransform<CollectionT, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyMultiOutputBeamPTransform(StreamExecutionEnvironment, PTransform<PBegin, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyNoOutputBeamPTransform(Map<String, ? extends DataSet<?>>, PTransform<PCollectionTuple, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyNoOutputBeamPTransform(Map<String, ? extends DataStream<?>>, PTransform<PCollectionTuple, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyNoOutputBeamPTransform(DataSet<InputT>, PTransform<CollectionT, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyNoOutputBeamPTransform(ExecutionEnvironment, PTransform<PBegin, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- applyNoOutputBeamPTransform(DataStream<InputT>, PTransform<CollectionT, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyNoOutputBeamPTransform(StreamExecutionEnvironment, PTransform<PBegin, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- applyOutputWatermarkHold(long, long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
-
Allows to apply a hold to the output watermark before it is sent out.
- applyOutputWatermarkHold(long, long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- applyRowMutations() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Write
RowMutation
messages to BigQuery. - applySdkEnvironmentOverrides(RunnerApi.Pipeline, DataflowPipelineOptions) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
- applyTransform(InputT, PTransform<? super InputT, OutputT>) - Static method in class org.apache.beam.sdk.Pipeline
-
For internal use only; no backwards-compatibility guarantees.
- applyTransform(String, InputT, PTransform<? super InputT, OutputT>) - Static method in class org.apache.beam.sdk.Pipeline
-
For internal use only; no backwards-compatibility guarantees.
- applyWindowing() - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
- ApproximateCountDistinct - Class in org.apache.beam.sdk.extensions.zetasketch
-
PTransform
s for estimating the number of distinct elements in aPCollection
, or the number of distinct values associated with each key in aPCollection
ofKV
s. - ApproximateCountDistinct() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- ApproximateCountDistinct.Globally<T> - Class in org.apache.beam.sdk.extensions.zetasketch
-
PTransform
for estimating the number of distinct elements in aPCollection
. - ApproximateCountDistinct.Globally.Builder<T> - Class in org.apache.beam.sdk.extensions.zetasketch
- ApproximateCountDistinct.PerKey<K,
V> - Class in org.apache.beam.sdk.extensions.zetasketch - ApproximateCountDistinct.PerKey.Builder<K,
V> - Class in org.apache.beam.sdk.extensions.zetasketch - ApproximateDistinct - Class in org.apache.beam.sdk.extensions.sketching
-
PTransform
s for computing the approximate number of distinct elements in a stream. - ApproximateDistinct() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- ApproximateDistinct.ApproximateDistinctFn<InputT> - Class in org.apache.beam.sdk.extensions.sketching
-
Implements the
Combine.CombineFn
ofApproximateDistinct
transforms. - ApproximateDistinct.GloballyDistinct<InputT> - Class in org.apache.beam.sdk.extensions.sketching
-
Implementation of
ApproximateDistinct.globally()
. - ApproximateDistinct.HyperLogLogPlusCoder - Class in org.apache.beam.sdk.extensions.sketching
-
Coder for
HyperLogLogPlus
class. - ApproximateDistinct.PerKeyDistinct<K,
V> - Class in org.apache.beam.sdk.extensions.sketching -
Implementation of
ApproximateDistinct.perKey()
. - ApproximateQuantiles - Class in org.apache.beam.sdk.transforms
-
PTransform
s for getting an idea of aPCollection
's data distribution using approximateN
-tiles (e.g. - ApproximateQuantiles.ApproximateQuantilesCombineFn<T,
ComparatorT> - Class in org.apache.beam.sdk.transforms -
The
ApproximateQuantilesCombineFn
combiner gives an idea of the distribution of a collection of values using approximateN
-tiles. - ApproximateUnique - Class in org.apache.beam.sdk.transforms
-
Deprecated.
- ApproximateUnique() - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique
-
Deprecated.
- ApproximateUnique.ApproximateUniqueCombineFn<T> - Class in org.apache.beam.sdk.transforms
-
Deprecated.
CombineFn
that computes an estimate of the number of distinct values that were combined. - ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique - Class in org.apache.beam.sdk.transforms
-
Deprecated.A heap utility class to efficiently track the largest added elements.
- ApproximateUnique.Globally<T> - Class in org.apache.beam.sdk.transforms
-
Deprecated.
PTransform
for estimating the number of distinct elements in aPCollection
. - ApproximateUnique.PerKey<K,
V> - Class in org.apache.beam.sdk.transforms -
Deprecated.
PTransform
for estimating the number of distinct values associated with each key in aPCollection
ofKV
s. - ApproximateUniqueCombineFn(long, Coder<T>) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- arbitrarily() - Static method in class org.apache.beam.sdk.transforms.Redistribute
- array() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns the backing array.
- array(TableSchema.ColumnType) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- array(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Create an array type for the given field type.
- array(Schema.FieldType, boolean) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Deprecated.Set the nullability on the elementType instead
- ARRAY - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- ARRAY - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- ARRAY_AGG_FN - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- ArrayAgg - Class in org.apache.beam.sdk.extensions.sql.impl.udaf
- ArrayAgg() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg
- ArrayAgg.ArrayAggArray<T> - Class in org.apache.beam.sdk.extensions.sql.impl.udaf
- ArrayAggArray() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- arrayContaining(List<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.arrayContaining(List)
. - arrayContaining(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.arrayContaining(Object[])
. - arrayContaining(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.arrayContaining(Matcher[])
. - arrayContaining(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.arrayContaining(Object[])
. - arrayContainingInAnyOrder(Collection<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.arrayContainingInAnyOrder(Collection)
. - arrayContainingInAnyOrder(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.arrayContainingInAnyOrder(Object[])
. - arrayContainingInAnyOrder(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.arrayContainingInAnyOrder(Matcher[])
. - arrayContainingInAnyOrder(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.arrayContainingInAnyOrder(Object[])
. - ArrayCopyState() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayCopyState
- arrayElementType() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- ArrayNewState() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayNewState
- ArrayOfNestedStringBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfNestedStringBundle
- ArrayOfStringBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfStringBundle
- arrayQualifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- arrayQualifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
- ArrayQualifierContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
- ArrayQualifierListContext(FieldSpecifierNotationParser.QualifierListContext) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
- arrayWithSize(int) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.arrayWithSize(int)
. - arrayWithSize(SerializableMatcher<? super Integer>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.arrayWithSize(Matcher)
. - ArrowConversion - Class in org.apache.beam.sdk.extensions.arrow
- ArrowConversion.ArrowSchemaTranslator - Class in org.apache.beam.sdk.extensions.arrow
-
Converts Arrow schema to Beam row schema.
- ArrowConversion.RecordBatchRowIterator - Class in org.apache.beam.sdk.extensions.arrow
- arrowSchemaFromInput(InputStream) - Static method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion
- ArrowSchemaTranslator() - Constructor for class org.apache.beam.sdk.extensions.arrow.ArrowConversion.ArrowSchemaTranslator
- ArtifactDestination() - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
- ArtifactRetrievalService - Class in org.apache.beam.runners.fnexecution.artifact
-
An
ArtifactRetrievalService
that usesFileSystems
as its backing storage. - ArtifactRetrievalService() - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- ArtifactRetrievalService(int) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- ArtifactRetrievalService(ArtifactResolver) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- ArtifactRetrievalService(ArtifactResolver, int) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- ArtifactStagingService - Class in org.apache.beam.runners.fnexecution.artifact
- ArtifactStagingService(ArtifactStagingService.ArtifactDestinationProvider) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
- ArtifactStagingService.ArtifactDestination - Class in org.apache.beam.runners.fnexecution.artifact
-
A pairing of a newly created artifact type and an output stream that will be readable at that type.
- ArtifactStagingService.ArtifactDestinationProvider - Interface in org.apache.beam.runners.fnexecution.artifact
-
Provides a concrete location to which artifacts can be staged on retrieval.
- as(Class<T>) - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
Transforms this object into an object of type
<T>
saving each property that has been manipulated. - as(Class<T>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Creates and returns an object that implements
<T>
. - as(Class<T>) - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
-
Creates and returns an object that implements
<T>
using the values configured on this builder during construction. - asCloudObject(Coder<?>, SdkComponents) - Static method in class org.apache.beam.runners.dataflow.util.CloudObjects
-
Convert the provided
Coder
into aCloudObject
. - asInputStream(int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns an
InputStream
wrapper which supplies the portion of this backing byte buffer starting atoffset
and up tolength
bytes. - asIterable() - Static method in class org.apache.beam.sdk.transforms.View
-
Returns a
View.AsIterable
transform that takes aPCollection
as input and produces aPCollectionView
mapping each window to anIterable
of the values in that window. - AsJsons<InputT> - Class in org.apache.beam.sdk.extensions.jackson
-
PTransform
for serializing objects to JSONStrings
. - AsJsons.AsJsonsWithFailures<FailureT> - Class in org.apache.beam.sdk.extensions.jackson
-
A
PTransform
that adds exception handling toAsJsons
. - asList() - Static method in class org.apache.beam.sdk.transforms.View
-
Returns a
View.AsList
transform that takes aPCollection
and returns aPCollectionView
mapping each window to aList
containing all of the elements in the window. - asMap() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
- asMap() - Static method in class org.apache.beam.sdk.transforms.View
-
Returns a
View.AsMap
transform that takes aPCollection<KV<K, V>>
as input and produces aPCollectionView
mapping each window to aMap<K, V>
. - asMultimap() - Static method in class org.apache.beam.sdk.transforms.View
-
Returns a
View.AsMultimap
transform that takes aPCollection<KV<K, V>>
as input and produces aPCollectionView
mapping each window to its contents as aMap<K, Iterable<V>>
for use as a side input. - asOutputReference(PValue, AppliedPTransform<?, ?, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Encode a PValue reference as an output reference.
- asOutputStream() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns an output stream which writes to the backing buffer from the current position.
- asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Returns the string representation of this subscription as a path used in the Cloud Pub/Sub API.
- asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Returns the string representation of this topic as a path used in the Cloud Pub/Sub API.
- asQueryable(QueryProvider, SchemaPlus, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
- asResponseObserver() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
- assertionError() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
- assertSourcesEqualReferenceSource(BoundedSource<T>, List<? extends BoundedSource<T>>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Given a reference
Source
and a list ofSource
s, assert that the union of the records read from the list of sources is equal to the records read from the reference source. - assertSplitAtFractionBehavior(BoundedSource<T>, int, double, SourceTestUtils.ExpectedSplitOutcome, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Asserts that the
source
's reader either fails tosplitAtFraction(fraction)
after readingnumItemsToReadBeforeSplit
items, or succeeds in a way that is consistent according toSourceTestUtils.assertSplitAtFractionSucceedsAndConsistent(org.apache.beam.sdk.io.BoundedSource<T>, int, double, org.apache.beam.sdk.options.PipelineOptions)
. - assertSplitAtFractionExhaustive(BoundedSource<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Asserts that for each possible start position,
BoundedSource.BoundedReader.splitAtFraction(double)
at every interesting fraction (halfway between two fractions that differ by at least one item) can be called successfully and the results are consistent if a split succeeds. - assertSplitAtFractionFails(BoundedSource<T>, int, double, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Asserts that the
source
's reader fails tosplitAtFraction(fraction)
after readingnumItemsToReadBeforeSplit
items. - assertSplitAtFractionSucceedsAndConsistent(BoundedSource<T>, int, double, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Verifies some consistency properties of
BoundedSource.BoundedReader.splitAtFraction(double)
on the given source. - assertSubscriptionEventuallyCreated(String, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Block until a subscription is created for this test topic in the specified project.
- assertThatAllRows(Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
- assertThatTopicEventuallyReceives(Matcher<PubsubMessage>...) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Repeatedly pull messages from
TestPubsub.subscriptionPath()
until receiving one for each matcher (or timeout is reached), then assert that the received messages match the expectations. - assertUnstartedReaderReadsSameAsItsSource(BoundedSource.BoundedReader<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Assert that a
Reader
returns aSource
that, when read from, produces the same records as the reader. - assign(BoundedWindow, Instant) - Method in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Shorthand for
TimestampCombiner.merge(org.apache.beam.sdk.transforms.windowing.BoundedWindow, java.lang.Iterable<? extends org.joda.time.Instant>)
with just one element, to place it into the context of a window. - assignableTo(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns true if this Schema can be assigned to another Schema.
- assignableToIgnoreNullable(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns true if this Schema can be assigned to another Schema, ignoring nullable.
- AssignContext() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
- assignedWindows(WindowFn<T, W>, long) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
- assignedWindowsWithValue(WindowFn<T, W>, TimestampedValue<T>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
- AssignShardFn(Integer) - Constructor for class org.apache.beam.sdk.transforms.Reshuffle.AssignShardFn
- assignShardKey(DestinationT, UserT, int) - Method in interface org.apache.beam.sdk.io.ShardingFunction
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Returns true if this
WindowFn
always assigns an element to exactly one window. - assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
-
Returns the single window to which elements with this timestamp belong.
- AssignWindowP<T> - Class in org.apache.beam.runners.jet.processors
-
/** * Jet
Processor
implementation for Beam's Windowing primitive. - assignWindows(WindowFn.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- assignWindows(WindowFn.AssignContext) - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
- assignWindows(WindowFn.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- assignWindows(WindowFn.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- assignWindows(WindowFn.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
- assignWindows(WindowFn.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Given a timestamp and element, returns the set of windows into which it should be placed.
- assignWindows(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- AssignWindowsFunction<T> - Class in org.apache.beam.runners.twister2.translators.functions
-
Assign Windows function.
- AssignWindowsFunction(WindowFn<T, BoundedWindow>, PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.translators.functions.AssignWindowsFunction
- AssignWindowTranslatorBatch<T> - Class in org.apache.beam.runners.twister2.translators.batch
-
Assign Window translator.
- AssignWindowTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.AssignWindowTranslatorBatch
- asSingleton() - Static method in class org.apache.beam.sdk.transforms.View
-
Returns a
View.AsSingleton
transform that takes aPCollection
with a single value per window as input and produces aPCollectionView
that returns the value in the main input window when read as a side input. - asSingletonView() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns a
PTransform
that produces aPCollectionView
whose elements are the result of combining elements per-window in the inputPCollection
. - assumedRoleArn() - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider
- assumeSingleMessageSchema() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
- ASTERISK - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
- ASTERISK_RELUCTANT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
- asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Deprecated.the v1beta1 API for Cloud Pub/Sub is deprecated.
- asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Deprecated. The v1beta1 API for Cloud Pub/Sub is deprecated.
- asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Deprecated. The v1beta2 API for Cloud Pub/Sub is deprecated.
- asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Deprecated. The v1beta2 API for Cloud Pub/Sub is deprecated.
- AsyncBatchWriteHandler<RecT, ResT> - Class in org.apache.beam.sdk.io.aws2.common
-
Async handler that automatically retries unprocessed records in case of a partial success.
- AsyncBatchWriteHandler(int, FluentBackoff, AsyncBatchWriteHandler.Stats, Function<ResT, String>, BiFunction<String, List<RecT>, CompletableFuture<List<ResT>>>) - Constructor for class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
- AsyncBatchWriteHandler.Stats - Interface in org.apache.beam.sdk.io.aws2.common
-
Statistics on the batch request.
- AsyncWatermarkCache - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.cache
-
Asynchronously computes the earliest partition watermark and stores it in memory.
- AsyncWatermarkCache(PartitionMetadataDao, Duration) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.cache.AsyncWatermarkCache
- atMinimumTimestamp(V) - Static method in class org.apache.beam.sdk.values.TimestampedValue
-
Returns a new TimestampedValue with the minimum timestamp.
- AtomicAccumulatorState() - Constructor for class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark.AtomicAccumulatorState
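A small sketch of TimestampedValue.atMinimumTimestamp(V) from the entry above:

  import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
  import org.apache.beam.sdk.values.TimestampedValue;

  // Wraps a value with BoundedWindow.TIMESTAMP_MIN_VALUE as its timestamp.
  TimestampedValue<String> tv = TimestampedValue.atMinimumTimestamp("hello");
  assert tv.getTimestamp().equals(BoundedWindow.TIMESTAMP_MIN_VALUE);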
- AtomicCoder<T> - Class in org.apache.beam.sdk.coders
- AtomicCoder() - Constructor for class org.apache.beam.sdk.coders.AtomicCoder
- AtomicLongFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.AtomicLongFactory
- atomicRead(KafkaIOUtilsBenchmark.AtomicAccumulatorState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- atomicReadWhileWriting(KafkaIOUtilsBenchmark.AtomicAccumulatorState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- atomicWrite(KafkaIOUtilsBenchmark.AtomicAccumulatorState, KafkaIOUtilsBenchmark.ProducerState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- atomicWriteWhileReading(KafkaIOUtilsBenchmark.AtomicAccumulatorState, KafkaIOUtilsBenchmark.ProducerState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- attached() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- attachValues(Object...) - Method in class org.apache.beam.sdk.values.Row.Builder
- attachValues(List<Object>) - Method in class org.apache.beam.sdk.values.Row.Builder
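A minimal sketch of Row.Builder.attachValues from the entries above (assuming, as in recent Beam releases, that attachValues builds and returns the Row directly):

  import org.apache.beam.sdk.schemas.Schema;
  import org.apache.beam.sdk.values.Row;

  Schema schema =
      Schema.builder().addStringField("name").addInt32Field("age").build();

  // Attaches the values directly; order and types must match the schema's fields.
  Row row = Row.withSchema(schema).attachValues("Alice", 42);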
- attempted(MetricKey, T) - Static method in class org.apache.beam.sdk.metrics.MetricResult
- ATTRIBUTE_ARRAY_ENTRY_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- ATTRIBUTE_ARRAY_FIELD_TYPE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- ATTRIBUTE_MAP_FIELD_TYPE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- AttributeValueCoder - Class in org.apache.beam.sdk.io.aws2.dynamodb
-
A Coder that serializes and deserializes the AttributeValue objects.
- audience() - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider
- AUTH_VALIDATION_GROUP - Static variable in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- AuthenticatedRetryInitializer(GoogleCredentials) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.AuthenticatedRetryInitializer
- Authentication - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- AUTO - Enum constant in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- AUTO - Enum constant in enum class org.apache.beam.sdk.io.Compression
-
When reading a file, automatically determine the compression type based on filename extension.
- AUTO - Enum constant in enum class org.apache.beam.sdk.io.TextIO.CompressionType
-
Deprecated.
- AUTO - Enum constant in enum class org.apache.beam.sdk.io.TFRecordIO.CompressionType
-
Deprecated.
- AUTO - Enum constant in enum class org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
-
Deprecated.
- AUTO - Static variable in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- autoCastField(Schema.Field, Object) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
-
Attempt to cast an object to a specified Schema.Field.Type.
- autoLoadUserDefinedFunctions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Load UDF/UDAFs from UdfUdafProvider.
- AutoScaler - Interface in org.apache.beam.sdk.io.jms
-
Enables users to specify their own `JMS` backlog reporters enabling JmsIO to report UnboundedSource.UnboundedReader.getTotalBacklogBytes().
- AUTOVALUE_CLASS - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- AUTOVALUE_CLASS - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
- AutoValueSchema - Class in org.apache.beam.sdk.schemas
-
A SchemaProvider for AutoValue classes.
- AutoValueSchema() - Constructor for class org.apache.beam.sdk.schemas.AutoValueSchema
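A hedged sketch of AutoValueSchema from the entry above, assuming AutoValue is on the classpath (the Transaction class is illustrative):

  import com.google.auto.value.AutoValue;
  import org.apache.beam.sdk.schemas.AutoValueSchema;
  import org.apache.beam.sdk.schemas.annotations.DefaultSchema;

  // Registers AutoValueSchema as the SchemaProvider for this class, so
  // PCollections of Transaction automatically carry a Beam schema.
  @DefaultSchema(AutoValueSchema.class)
  @AutoValue
  public abstract class Transaction {
    public abstract String getBank();
    public abstract double getAmount();

    public static Transaction create(String bank, double amount) {
      return new AutoValue_Transaction(bank, amount);
    }
  }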
- AutoValueSchema.AbstractGetterTypeSupplier - Class in org.apache.beam.sdk.schemas
-
FieldValueTypeSupplier that's based on AutoValue getters.
- AutoValueUtils - Class in org.apache.beam.sdk.schemas.utils
-
Utilities for managing AutoValue schemas.
- AutoValueUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.AutoValueUtils
- AVAILABLE_NOW - Static variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- AvailableParallelismFactory() - Constructor for class org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory
- Available transforms - Search tag in class org.apache.beam.sdk.managed.Managed
- Section
- AVG - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- AVRO_CODER_URN - Static variable in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderRegistrar
- AvroCoder<T> - Class in org.apache.beam.sdk.extensions.avro.coders
-
A Coder using Avro binary format.
- AvroCoder(Class<T>, Schema) - Constructor for class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- AvroCoder(Class<T>, Schema, boolean) - Constructor for class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- AvroCoder(AvroDatumFactory<T>, Schema) - Constructor for class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
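A short sketch of the AvroCoder entries above, assuming an existing Pipeline named pipeline (MyEvent is illustrative):

  import org.apache.beam.sdk.extensions.avro.coders.AvroCoder;
  import org.apache.beam.sdk.transforms.Create;
  import org.apache.beam.sdk.values.PCollection;

  // An illustrative POJO; AvroCoder derives its Avro schema via reflection.
  class MyEvent {
    long id;
    String name;
  }

  // Encode elements of this PCollection in Avro binary format.
  PCollection<MyEvent> events =
      pipeline.apply(Create.of(new MyEvent()).withCoder(AvroCoder.of(MyEvent.class)));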
- AvroConvertType(boolean) - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertType
- AvroDatumFactory<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Create DatumReader and DatumWriter for given schemas.
- AvroDatumFactory(Class<T>) - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
- AvroDatumFactory.GenericDatumFactory - Class in org.apache.beam.sdk.extensions.avro.io
-
Specialized AvroDatumFactory for GenericRecord.
- AvroDatumFactory.ReflectDatumFactory<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Specialized AvroDatumFactory for Java classes transforming to Avro through reflection.
- AvroDatumFactory.SpecificDatumFactory<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Specialized AvroDatumFactory for SpecificRecord.
- AvroGenericCoder - Class in org.apache.beam.sdk.extensions.avro.coders
-
AvroCoder specialisation for GenericRecord, needed for cross-language transforms.
- AvroGenericCoderRegistrar - Class in org.apache.beam.sdk.extensions.avro
-
Coder registrar for AvroGenericCoder.
- AvroGenericCoderRegistrar() - Constructor for class org.apache.beam.sdk.extensions.avro.AvroGenericCoderRegistrar
- AvroGenericCoderTranslator - Class in org.apache.beam.sdk.extensions.avro
-
Coder translator for AvroGenericCoder.
- AvroGenericCoderTranslator() - Constructor for class org.apache.beam.sdk.extensions.avro.AvroGenericCoderTranslator
- AvroGenericRecordToStorageApiProto - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Utility methods for converting Avro GenericRecord objects to dynamic protocol message, for use with the Storage write API.
- AvroGenericRecordToStorageApiProto() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.AvroGenericRecordToStorageApiProto
- AvroIO - Class in org.apache.beam.sdk.extensions.avro.io
-
PTransforms for reading and writing Avro files.
- AvroIO.Parse<T> - Class in org.apache.beam.sdk.extensions.avro.io
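A hedged sketch of AvroIO from the entry above, assuming an existing Pipeline named pipeline; the paths and record schema are illustrative:

  import org.apache.avro.generic.GenericRecord;
  import org.apache.beam.sdk.extensions.avro.io.AvroIO;
  import org.apache.beam.sdk.values.PCollection;

  org.apache.avro.Schema avroSchema = new org.apache.avro.Schema.Parser().parse(
      "{\"type\":\"record\",\"name\":\"Rec\",\"fields\":[{\"name\":\"id\",\"type\":\"long\"}]}");

  // Read all matching Avro files into GenericRecords...
  PCollection<GenericRecord> records =
      pipeline.apply(AvroIO.readGenericRecords(avroSchema).from("gs://my-bucket/in/*.avro"));

  // ...and write them back out with the same schema.
  records.apply(AvroIO.writeGenericRecords(avroSchema).to("gs://my-bucket/out/part"));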
- AvroIO.ParseAll<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Deprecated. See AvroIO.parseAllGenericRecords(SerializableFunction) for details.
- AvroIO.ParseFiles<T> - Class in org.apache.beam.sdk.extensions.avro.io
- AvroIO.Read<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Implementation of AvroIO.read(java.lang.Class<T>) and AvroIO.readGenericRecords(org.apache.avro.Schema).
- AvroIO.ReadAll<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Deprecated. See AvroIO.readAll(Class) for details.
- AvroIO.ReadFiles<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Implementation of AvroIO.readFiles(java.lang.Class<T>).
- AvroIO.RecordFormatter<ElementT> - Interface in org.apache.beam.sdk.extensions.avro.io
-
Deprecated. Users can achieve the same by providing this transform in a ParDo before using AvroIO.write(Class).
- AvroIO.Sink<ElementT> - Class in org.apache.beam.sdk.extensions.avro.io
- AvroIO.TypedWrite<UserT, DestinationT, OutputT> - Class in org.apache.beam.sdk.extensions.avro.io
-
Implementation of AvroIO.write(java.lang.Class<T>).
- AvroIO.Write<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
This class is used as the default return value of AvroIO.write(java.lang.Class<T>).
- AvroJavaTimeConversions - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
-
Avro 1.8 ships with joda time conversions only.
- AvroJavaTimeConversions() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions
- AvroJavaTimeConversions.DateConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.LocalTimestampMicros - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.LocalTimestampMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.LocalTimestampMillis - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.LocalTimestampMillisConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.TimeMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.TimeMillisConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.TimestampMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJavaTimeConversions.TimestampMillisConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJodaTimeConversions - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
-
Avro 1.8 & 1.9 ship joda time conversions.
- AvroJodaTimeConversions() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions
- AvroJodaTimeConversions.DateConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJodaTimeConversions.LossyTimeMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJodaTimeConversions.LossyTimestampMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJodaTimeConversions.TimeConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJodaTimeConversions.TimeMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJodaTimeConversions.TimestampConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroJodaTimeConversions.TimestampMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroPayloadSerializerProvider - Class in org.apache.beam.sdk.extensions.avro.schemas.io.payloads
- AvroPayloadSerializerProvider() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.io.payloads.AvroPayloadSerializerProvider
- AvroReader(AvroSource<T>) - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
-
Reads Avro records of type T from the specified source.
- AvroReadSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
- AvroReadSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.AvroReadSchemaTransformFormatProvider
- AvroRecordSchema - Class in org.apache.beam.sdk.extensions.avro.schemas
-
A SchemaProvider for AVRO generated SpecificRecords and POJOs.
- AvroRecordSchema() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.AvroRecordSchema
- AvroSchemaInformationProvider - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroSchemaInformationProvider() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroSchemaInformationProvider
- AvroSchemaIOProvider - Class in org.apache.beam.sdk.extensions.avro.io
-
An implementation of SchemaIOProvider for reading and writing Avro files with AvroIO.
- AvroSchemaIOProvider() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
- AvroSink<UserT, DestinationT, OutputT> - Class in org.apache.beam.sdk.extensions.avro.io
-
A FileBasedSink for Avro files.
- AvroSink.DatumWriterFactory<T> - Interface in org.apache.beam.sdk.extensions.avro.io
- AvroSource<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
Do not use in pipelines directly: most users should use AvroIO.Read.
- AvroSource.AvroReader<T> - Class in org.apache.beam.sdk.extensions.avro.io
-
A BlockBasedSource.BlockBasedReader for reading blocks from Avro files.
- AvroSource.DatumReaderFactory<T> - Interface in org.apache.beam.sdk.extensions.avro.io
- AvroTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.avro
-
TableProvider for AvroIO for consumption by Beam SQL.
- AvroTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.avro.AvroTableProvider
- AvroUtils - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
-
Utils to convert AVRO records to Beam rows.
- AvroUtils.AvroConvertType - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroUtils.AvroConvertValueForGetter - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroUtils.AvroConvertValueForSetter - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroUtils.FixedBytesField - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
-
Wrapper for fixed byte fields.
- AvroUtils.TypeWithNullability - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
- AvroWriteRequest<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
- AvroWriteRequest(T, Schema) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
- AvroWriteSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
-
A FileWriteSchemaTransformFormatProvider for Avro format.
- AvroWriteSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.AvroWriteSchemaTransformFormatProvider
- awaitCompletion() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Uses the caller's thread to process all elements received until we receive the end of the stream from the upstream producer for all endpoints specified.
- awaitTermination(Duration) - Method in class org.apache.beam.runners.spark.SparkPipelineResult
- awaitTermination(Duration) - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger.DebugSparkPipelineResult
- AwsBuilderFactory<PojoT, BuilderT> - Class in org.apache.beam.sdk.io.aws2.schemas
-
Builder factory for AWS SdkPojo to avoid using reflection to instantiate a builder.
- AwsBuilderFactory() - Constructor for class org.apache.beam.sdk.io.aws2.schemas.AwsBuilderFactory
- AwsModule - Class in org.apache.beam.sdk.io.aws2.options
-
A Jackson Module that registers a JsonSerializer and JsonDeserializer for AwsCredentialsProvider and some subclasses.
- AwsModule() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsModule
- AwsOptions - Interface in org.apache.beam.sdk.io.aws2.options
-
Options used to configure Amazon Web Services specific options such as credentials and region.
- AwsOptions.AwsRegionFactory - Class in org.apache.beam.sdk.io.aws2.options
-
Attempt to load default region.
- AwsOptions.AwsUserCredentialsFactory - Class in org.apache.beam.sdk.io.aws2.options
-
Return DefaultCredentialsProvider as default provider.
- AwsPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.aws2.options
-
A registrar containing the default AWS options.
- AwsPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsPipelineOptionsRegistrar
- AwsRegionFactory() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsRegionFactory
- AwsSchemaProvider - Class in org.apache.beam.sdk.io.aws2.schemas
-
Schema provider for AWS SdkPojo models using the provided field metadata (see SdkPojo.sdkFields()) rather than reflection.
- AwsSchemaProvider() - Constructor for class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
- AwsSchemaRegistrar - Class in org.apache.beam.sdk.io.aws2.schemas
- AwsSchemaRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaRegistrar
- AwsSerializableUtils - Class in org.apache.beam.sdk.io.aws2.options
-
Utilities for working with AWS Serializables.
- AwsSerializableUtils() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsSerializableUtils
- AwsTypes - Class in org.apache.beam.sdk.io.aws2.schemas
- AwsTypes() - Constructor for class org.apache.beam.sdk.io.aws2.schemas.AwsTypes
- AwsUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsUserCredentialsFactory
- AzureBlobStoreFileSystemRegistrar - Class in org.apache.beam.sdk.io.azure.blobstore
-
AutoService registrar for the AzureBlobStoreFileSystem.
- AzureBlobStoreFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.azure.blobstore.AzureBlobStoreFileSystemRegistrar
- AzureModule - Class in org.apache.beam.sdk.io.azure.options
-
A Jackson Module that registers a JsonSerializer and JsonDeserializer for Azure credential providers.
- AzureModule() - Constructor for class org.apache.beam.sdk.io.azure.options.AzureModule
- AzureOptions - Interface in org.apache.beam.sdk.io.azure.options
- AzureOptions.AzureUserCredentialsFactory - Class in org.apache.beam.sdk.io.azure.options
-
Attempts to load Azure credentials.
- AzurePipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.azure.options
-
A registrar containing the default Azure options.
- AzurePipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.azure.options.AzurePipelineOptionsRegistrar
- AzureUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.io.azure.options.AzureOptions.AzureUserCredentialsFactory
B
- BACKLOG_UNKNOWN - Static variable in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Constant representing an unknown amount of backlog.
- backlogBytes() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Gauge for source backlog in bytes.
- backlogBytesOfSplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Gauge for source split backlog in bytes.
- backlogElements() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Gauge for source backlog in elements.
- backlogElementsOfSplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Gauge for source split backlog in elements.
- BackOffAdapter - Class in org.apache.beam.sdk.extensions.gcp.util
-
An adapter for converting between Apache Beam and Google API client representations of backoffs.
- BackOffAdapter() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.BackOffAdapter
- BAD_RECORD_TAG - Static variable in interface org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter
- BadRecord - Class in org.apache.beam.sdk.transforms.errorhandling
- BadRecord() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord
- BadRecord.Builder - Class in org.apache.beam.sdk.transforms.errorhandling
- BadRecord.Failure - Class in org.apache.beam.sdk.transforms.errorhandling
- BadRecord.Failure.Builder - Class in org.apache.beam.sdk.transforms.errorhandling
- BadRecord.Record - Class in org.apache.beam.sdk.transforms.errorhandling
- BadRecord.Record.Builder - Class in org.apache.beam.sdk.transforms.errorhandling
- BadRecordErrorHandler(PTransform<PCollection<BadRecord>, OutputT>, Pipeline) - Constructor for class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.BadRecordErrorHandler
-
Constructs a new ErrorHandler for handling BadRecords.
- BadRecordRouter - Interface in org.apache.beam.sdk.transforms.errorhandling
- BadRecordRouter.RecordingBadRecordRouter - Class in org.apache.beam.sdk.transforms.errorhandling
- BadRecordRouter.ThrowingBadRecordRouter - Class in org.apache.beam.sdk.transforms.errorhandling
- bag() - Static method in class org.apache.beam.sdk.state.StateSpecs
- bag(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Identical to StateSpecs.bag(), but with an element coder explicitly supplied.
- BagState<T> - Interface in org.apache.beam.sdk.state
-
A ReadableState cell containing a bag of values.
- BagUserStateSpec() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
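A minimal sketch of StateSpecs.bag() and BagState from the entries above; stateful DoFns require keyed input, and the names here are illustrative:

  import org.apache.beam.sdk.state.BagState;
  import org.apache.beam.sdk.state.StateSpec;
  import org.apache.beam.sdk.state.StateSpecs;
  import org.apache.beam.sdk.transforms.DoFn;
  import org.apache.beam.sdk.values.KV;

  // Buffers every value seen for a key and emits a running sum.
  class RunningSumFn extends DoFn<KV<String, Integer>, Integer> {
    @StateId("buffer")
    private final StateSpec<BagState<Integer>> bufferSpec = StateSpecs.bag();

    @ProcessElement
    public void process(
        @Element KV<String, Integer> element,
        @StateId("buffer") BagState<Integer> buffer,
        OutputReceiver<Integer> out) {
      buffer.add(element.getValue());
      int sum = 0;
      for (int v : buffer.read()) {
        sum += v;
      }
      out.output(sum);
    }
  }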
- BASE_IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.FixedPrecisionNumeric
-
Identifier of the unspecified precision numeric type.
- baseBackoff() - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
- baseBackoff(Duration) - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
- BaseBeamTable - Class in org.apache.beam.sdk.extensions.sql.meta
-
Basic implementation of BeamSqlTable.
- BaseBeamTable() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
- baseNameBuilder(String) - Static method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.MetricNameBuilder
- baseUrl() - Method in class org.apache.beam.sdk.io.solr.SolrIO.ReplicaInfo
- BASIC - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
- BASIC - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
- BASIC_CONNECTION_INFO_VALIDATION_GROUP - Static variable in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- BasicAuthJcsmpSessionServiceFactory - Class in org.apache.beam.sdk.io.solace.broker
-
A factory for creating JcsmpSessionService instances.
- BasicAuthJcsmpSessionServiceFactory() - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
- BasicAuthJcsmpSessionServiceFactory.Builder - Class in org.apache.beam.sdk.io.solace.broker
- BasicAuthSempClient - Class in org.apache.beam.sdk.io.solace.broker
-
A class that manages REST calls to the Solace Element Management Protocol (SEMP) using basic authentication.
- BasicAuthSempClient(String, String, String, String, SerializableSupplier<HttpRequestFactory>) - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClient
- BasicAuthSempClientFactory - Class in org.apache.beam.sdk.io.solace.broker
-
A factory for creating BasicAuthSempClient instances.
- BasicAuthSempClientFactory() - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory
- BasicAuthSempClientFactory.Builder - Class in org.apache.beam.sdk.io.solace.broker
- Basic Usage - Search tag in class org.apache.beam.io.requestresponse.RequestResponseIO
- Section
- BATCH - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
-
Specifies that a query should be run with a BATCH priority.
- BATCH - Enum constant in enum class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.ScanType
- BATCH_IMPORT - Enum constant in enum class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.WriteMethod
-
Batch import write method.
- batchCombinePerKey(FlinkStreamingTranslationContext, PTransform<PCollection<KV<K, InputT>>, PCollection<KV<K, OutputT>>>, CombineFnBase.GlobalCombineFn<InputT, AccumT, OutputT>, Map<Integer, PCollectionView<?>>, List<PCollectionView<?>>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
- batchCombinePerKeyNoSideInputs(FlinkStreamingTranslationContext, PTransform<PCollection<KV<K, InputT>>, PCollection<KV<K, OutputT>>>, CombineFnBase.GlobalCombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
- BatchContextImpl - Class in org.apache.beam.sdk.io.cdap.context
-
Class for Batch, Sink and Stream CDAP wrapper classes that use it to provide common details.
- BatchContextImpl() - Constructor for class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- BATCHED - Enum constant in enum class org.apache.beam.sdk.io.solace.SolaceIO.WriterType
- batchGetDocuments() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
-
Factory method to create a new type-safe builder for BatchGetDocumentsRequest operations.
- batchGroupByKey(FlinkStreamingTranslationContext, PTransform<PCollection<KV<K, InputT>>, PCollection<KV<K, Iterable<InputT>>>>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
-
Creates a two-step GBK operation.
- Batching and Grouping - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- BatchingParams() - Constructor for class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- BatchSideInputHandlerFactory - Class in org.apache.beam.runners.fnexecution.translation
-
StateRequestHandler that uses a BatchSideInputHandlerFactory.SideInputGetter to access side inputs.
- BatchSideInputHandlerFactory.SideInputGetter - Interface in org.apache.beam.runners.fnexecution.translation
-
Returns the value for the side input with the given PCollection id from the runner.
- BatchSinkContextImpl - Class in org.apache.beam.sdk.io.cdap.context
-
Class for creating context object of different CDAP classes with batch sink type.
- BatchSinkContextImpl() - Constructor for class org.apache.beam.sdk.io.cdap.context.BatchSinkContextImpl
- batchSize() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
- batchSize() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- BatchSourceContextImpl - Class in org.apache.beam.sdk.io.cdap.context
-
Class for creating context object of different CDAP classes with batch source type.
- BatchSourceContextImpl() - Constructor for class org.apache.beam.sdk.io.cdap.context.BatchSourceContextImpl
- BatchStatefulParDoOverrides - Class in org.apache.beam.runners.dataflow
-
PTransformOverrideFactories that expands to correctly implement stateful ParDo using window-unaware BatchViewOverrides.GroupByKeyAndSortValuesOnly to linearize processing per key.
- BatchStatefulParDoOverrides() - Constructor for class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides
- BatchStatefulParDoOverrides.BatchStatefulDoFn<K, V, OutputT> - Class in org.apache.beam.runners.dataflow
-
A key-preserving DoFn that explodes an iterable that has been grouped by key and window.
- BatchTransformTranslator<TransformT> - Interface in org.apache.beam.runners.twister2.translators
-
Batch TransformTranslator interface.
- batchWrite() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Write
- batchWrite(String, List<RecT>) - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
Asynchronously trigger a batch write request (unless already in error state).
- batchWrite(String, List<RecT>, boolean) - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
Asynchronously trigger a batch write request (unless already in error state).
- BatchWriteWithSummary(JodaClock, FirestoreStatefulComponentFactory, RpcQosOptions, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary
- Batch writing - Search tag in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
- Section
- BEAM_INSTANCE_PROPERTY - Static variable in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- BeamAggregateProjectMergeRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
This rule is essentially a wrapper around Calcite's AggregateProjectMergeRule.
- BeamAggregateProjectMergeRule(Class<? extends Aggregate>, Class<? extends Project>, RelBuilderFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregateProjectMergeRule
- BeamAggregationRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace an Aggregate node.
- BeamAggregationRel(RelOptCluster, RelTraitSet, RelNode, ImmutableBitSet, List<ImmutableBitSet>, List<AggregateCall>, WindowFn<Row, IntervalWindow>, int) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
- BeamAggregationRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
Rule to detect the window/trigger settings.
- BeamAggregationRule(Class<? extends Aggregate>, Class<? extends Project>, RelBuilderFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregationRule
- BeamBasicAggregationRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
Aggregation rule that doesn't include projection.
- BeamBasicAggregationRule(Class<? extends Aggregate>, RelBuilderFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamBasicAggregationRule
- BeamBatchTSetEnvironment - Class in org.apache.beam.runners.twister2
-
This is a shell TSet environment which is used as a central driver model to fit what Beam expects.
- BeamBatchTSetEnvironment() - Constructor for class org.apache.beam.runners.twister2.BeamBatchTSetEnvironment
- BeamBatchWorker - Class in org.apache.beam.runners.twister2
-
The Twister2 worker that will execute the job logic once the job is submitted from the run method.
- BeamBatchWorker() - Constructor for class org.apache.beam.runners.twister2.BeamBatchWorker
- BeamBigQuerySqlDialect - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
- BeamBigQuerySqlDialect(SqlDialect.Context) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- BeamBuiltinAggregations - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Built-in aggregations functions for COUNT/MAX/MIN/SUM/AVG/VAR_POP/VAR_SAMP.
- BeamBuiltinAggregations() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations
- BeamBuiltinAggregations.BitXOr<T> - Class in org.apache.beam.sdk.extensions.sql.impl.transform
- BeamBuiltinAnalyticFunctions - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Built-in Analytic Functions for the aggregation analytics functionality.
- BeamBuiltinAnalyticFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
- BeamBuiltinAnalyticFunctions.PositionAwareCombineFn<InputT, AccumT, OutputT> - Class in org.apache.beam.sdk.extensions.sql.impl.transform
- BeamBuiltinFunctionProvider - Class in org.apache.beam.sdk.extensions.sql.impl.udf
-
BeamBuiltinFunctionClass interface.
- BeamBuiltinFunctionProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.BeamBuiltinFunctionProvider
- BeamBuiltinMethods - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
-
BeamBuiltinMethods.
- BeamBuiltinMethods() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
- BeamCalciteSchema - Class in org.apache.beam.sdk.extensions.sql.impl
-
Adapter from TableProvider to Schema.
- BeamCalciteTable - Class in org.apache.beam.sdk.extensions.sql.impl
-
Adapter from BeamSqlTable to a Calcite Table.
- BeamCalcMergeRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
Planner rule to merge a BeamCalcRel with a BeamCalcRel.
- BeamCalcMergeRule() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcMergeRule
- BeamCalcRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace Project and Filter node.
- BeamCalcRel(RelOptCluster, RelTraitSet, RelNode, RexProgram) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
- BeamCalcRel.WrappedList<T> - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
WrappedList translates List on access.
- BeamCalcRel.WrappedMap<V> - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
WrappedMap translates Map on access.
- BeamCalcRel.WrappedRow - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
WrappedRow translates Row on access.
- BeamCalcRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- BeamCalcSplittingRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
A RelOptRule that converts a LogicalCalc into a chain of AbstractBeamCalcRel nodes via CalcRelSplitter.
- BeamCalcSplittingRule(String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcSplittingRule
- BeamCodegenUtils - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
-
BeamCodegenUtils.
- BeamCodegenUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamCodegenUtils
- BeamCoGBKJoinRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
A BeamJoinRel which does CoGBK Join.
- BeamCoGBKJoinRel(RelOptCluster, RelTraitSet, RelNode, RelNode, RexNode, Set<CorrelationId>, JoinRelType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCoGBKJoinRel
- BeamCoGBKJoinRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
Rule to convert LogicalJoin node to BeamCoGBKJoinRel node.
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamPushDownIOSourceRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
-
This method is called by org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl.
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
-
A dummy cost computation based on a fixed multiplier.
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRel
- beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
- BeamCostModel - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
VolcanoCost represents the cost of a plan node.
- BeamCostModel.Factory - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
Implementation of RelOptCostFactory that creates BeamCostModels.
- BeamEnumerableConverter - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace an Enumerable node.
- BeamEnumerableConverter(RelOptCluster, RelTraitSet, RelNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
- BeamEnumerableConverterRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- beamFilesystemArtifactDestinationProvider(String) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
-
An ArtifactDestinationProvider that places new artifacts as files in a Beam filesystem.
- BeamFlinkDataSetAdapter - Class in org.apache.beam.runners.flink.adapter
-
An adapter class that allows one to apply Apache Beam PTransforms directly to Flink DataSets.
- BeamFlinkDataSetAdapter() - Constructor for class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- BeamFlinkDataSetAdapter(PipelineOptions) - Constructor for class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
- BeamFlinkDataStreamAdapter - Class in org.apache.beam.runners.flink.adapter
-
An adapter class that allows one to apply Apache Beam PTransforms directly to Flink DataStreams.
- BeamFlinkDataStreamAdapter() - Constructor for class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- BeamFlinkDataStreamAdapter(PipelineOptions) - Constructor for class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
- BeamFnDataGrpcMultiplexer - Class in org.apache.beam.sdk.fn.data
-
A gRPC multiplexer for a specific Endpoints.ApiServiceDescriptor.
- BeamFnDataGrpcMultiplexer(Endpoints.ApiServiceDescriptor, OutboundObserverFactory, OutboundObserverFactory.BasicFactory<BeamFnApi.Elements, BeamFnApi.Elements>) - Constructor for class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
- BeamFnDataInboundObserver - Class in org.apache.beam.sdk.fn.data
- BeamFnDataInboundObserver.CloseException - Exception Class in org.apache.beam.sdk.fn.data
- BeamFnDataOutboundAggregator - Class in org.apache.beam.sdk.fn.data
-
An outbound data buffering aggregator with size-based buffer and time-based buffer if corresponding options are set.
- BeamFnDataOutboundAggregator(PipelineOptions, Supplier<String>, StreamObserver<BeamFnApi.Elements>, boolean) - Constructor for class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
- BeamImpulseSource - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.impulse
-
A Beam BoundedSource for Impulse Source.
- BeamImpulseSource() - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.impulse.BeamImpulseSource
- BeamIntersectRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace an Intersect node.
- BeamIntersectRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
- BeamIntersectRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Intersect with BeamIntersectRel.
- BeamIOPushDownRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- BeamIOPushDownRule(RelBuilderFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOPushDownRule
- BeamIOSinkRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a TableModify node.
- BeamIOSinkRel(RelOptCluster, RelOptTable, Prepare.CatalogReader, RelNode, TableModify.Operation, List<String>, List<RexNode>, boolean, BeamSqlTable, Map<String, String>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- BeamIOSinkRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- BeamIOSourceRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a TableScan node.
- BeamIOSourceRel(RelOptCluster, RelTraitSet, RelOptTable, BeamSqlTable, Map<String, String>, BeamCalciteTable) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- BeamJavaTypeFactory - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
customized data type in Beam.
- BeamJavaUdfCalcRule - Class in org.apache.beam.sdk.extensions.sql.zetasql
- BeamJoinAssociateRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
This is very similar to JoinAssociateRule.
- BeamJoinPushThroughJoinRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
This is essentially identical to JoinPushThroughJoinRule.
- BeamJoinRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
An abstract BeamRelNode to implement Join Rels.
- BeamJoinRel(RelOptCluster, RelTraitSet, RelNode, RelNode, RexNode, Set<CorrelationId>, JoinRelType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
- BeamJoinTransforms - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Collections of PTransform and DoFn used to perform JOIN operation.
- BeamJoinTransforms() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms
- BeamJoinTransforms.JoinAsLookup - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Transform to execute Join as Lookup.
- BeamKafkaCSVTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.kafka
-
A Kafka topic that saves records as CSV format.
- BeamKafkaCSVTable(Schema, String, List<String>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
- BeamKafkaCSVTable(Schema, String, List<String>, TimestampPolicyFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
- BeamKafkaCSVTable(Schema, String, List<String>, CSVFormat, TimestampPolicyFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
- BeamKafkaTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.kafka
-
BeamKafkaTable represents a Kafka topic, as source or target.
- BeamKafkaTable(Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- BeamKafkaTable(Schema, String, List<String>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- BeamKafkaTable(Schema, String, List<String>, TimestampPolicyFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- BeamKafkaTable(Schema, List<TopicPartition>, String) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- BeamKafkaTable(Schema, List<TopicPartition>, String, TimestampPolicyFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- BeamLogicalConvention - Enum Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
Convention for Beam SQL.
- BeamMatchRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Match node.
- BeamMatchRel(RelOptCluster, RelTraitSet, RelNode, RelDataType, RexNode, boolean, boolean, Map<String, RexNode>, Map<String, RexNode>, RexNode, Map<String, ? extends SortedSet<String>>, boolean, ImmutableBitSet, RelCollation, RexNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
- BeamMatchRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Match with BeamMatchRel.
- BeamMinusRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Minus node.
- BeamMinusRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
- BeamMinusRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Minus with BeamMinusRel.
- BeamPCollectionTable<InputT> - Class in org.apache.beam.sdk.extensions.sql.impl.schema
-
BeamPCollectionTable converts a PCollection<Row> into a virtual table, so a downstream query can query it directly.
- BeamPCollectionTable(PCollection<InputT>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
- BeamPushDownIOSourceRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
- BeamPushDownIOSourceRel(RelOptCluster, RelTraitSet, RelOptTable, BeamSqlTable, List<String>, BeamSqlTableFilter, Map<String, String>, BeamCalciteTable) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamPushDownIOSourceRel
- BeamRelDataTypeSystem - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
customized data type in Beam.
- BeamRelMetadataQuery - Class in org.apache.beam.sdk.extensions.sql.impl.planner
- BeamRelNode - Interface in org.apache.beam.sdk.extensions.sql.impl.rel
-
A RelNode that can also give a PTransform that implements the expression.
- beamRow2CsvLine(Row, CSVFormat) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
- beamRowFromSourceRecordFn(Schema) - Static method in class org.apache.beam.io.debezium.KafkaConnectUtils
- Beam Rows - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- BeamRowToBigtableMutation - Class in org.apache.beam.sdk.io.gcp.bigtable
- BeamRowToBigtableMutation(Map<String, Set<String>>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation
- BeamRowToBigtableMutation.ToBigtableRowFn - Class in org.apache.beam.sdk.io.gcp.bigtable
- beamRowToIcebergRecord(Schema, Row) - Static method in class org.apache.beam.sdk.io.iceberg.IcebergUtils
-
Converts a Beam Row to an Iceberg Record.
- BeamRowToStorageApiProto - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Utility methods for converting Beam Row objects to dynamic protocol message, for use with the Storage write API.
- BeamRowToStorageApiProto() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BeamRowToStorageApiProto
- BeamRuleSets - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
RuleSet used in BeamQueryPlanner.
- BeamRuleSets() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRuleSets
- beamSchemaFromJsonSchema(String) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
- beamSchemaFromKafkaConnectSchema(Schema) - Static method in class org.apache.beam.io.debezium.KafkaConnectUtils
- beamSchemaToIcebergSchema(Schema) - Static method in class org.apache.beam.sdk.io.iceberg.IcebergUtils
-
Converts a Beam Schema to an Iceberg Schema.
- beamSchemaTypeFromKafkaType(Schema) - Static method in class org.apache.beam.io.debezium.KafkaConnectUtils
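A short sketch of the Beam-to-Iceberg schema conversion above (field names illustrative):

  import org.apache.beam.sdk.io.iceberg.IcebergUtils;
  import org.apache.beam.sdk.schemas.Schema;

  Schema beamSchema =
      Schema.builder().addInt64Field("id").addStringField("name").build();

  // Derive the equivalent Iceberg schema from the Beam schema.
  org.apache.iceberg.Schema icebergSchema =
      IcebergUtils.beamSchemaToIcebergSchema(beamSchema);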
- BeamSetOperatorRelBase - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
Delegate for Set operators: BeamUnionRel, BeamIntersectRel and BeamMinusRel.
- BeamSetOperatorRelBase(BeamRelNode, BeamSetOperatorRelBase.OpType, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase
- BeamSetOperatorRelBase.OpType - Enum Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
Set operator type.
- BeamSetOperatorsTransforms - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Collections of PTransform and DoFn used to perform Set operations.
- BeamSetOperatorsTransforms() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms
- BeamSetOperatorsTransforms.BeamSqlRow2KvFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Transform a BeamSqlRow to a KV<BeamSqlRow, BeamSqlRow>.
- BeamSetOperatorsTransforms.SetOperatorFilteringDoFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Filter function used for Set operators.
- BeamSideInputJoinRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
A BeamJoinRel which does side-input Join.
- BeamSideInputJoinRel(RelOptCluster, RelTraitSet, RelNode, RelNode, RexNode, Set<CorrelationId>, JoinRelType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputJoinRel
- BeamSideInputJoinRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
Rule to convert LogicalJoin node to BeamSideInputJoinRel node.
- BeamSideInputLookupJoinRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
A BeamJoinRel which does Lookup Join.
- BeamSideInputLookupJoinRel(RelOptCluster, RelTraitSet, RelNode, RelNode, RexNode, Set<CorrelationId>, JoinRelType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputLookupJoinRel
- BeamSideInputLookupJoinRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
Rule to convert LogicalJoin node to BeamSideInputLookupJoinRel node.
- BeamSideInputLookupJoinRule() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputLookupJoinRule
- BeamSortRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Sort node.
- BeamSortRel(RelOptCluster, RelTraitSet, RelNode, RelCollation, RexNode, RexNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- BeamSortRel.BeamSqlRowComparator - Class in org.apache.beam.sdk.extensions.sql.impl.rel
- BeamSortRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Sort with BeamSortRel.
- beamSource - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- BeamSqlCli - Class in org.apache.beam.sdk.extensions.sql
-
BeamSqlCli provides methods to execute Beam SQL with an interactive client.
- BeamSqlCli() - Constructor for class org.apache.beam.sdk.extensions.sql.BeamSqlCli
- BeamSqlDataCatalogExample - Class in org.apache.beam.sdk.extensions.sql.example
-
Example pipeline that uses Google Cloud Data Catalog to retrieve the table metadata.
- BeamSqlDataCatalogExample() - Constructor for class org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample
- BeamSqlDataCatalogExample.DCExamplePipelineOptions - Interface in org.apache.beam.sdk.extensions.sql.example
-
Pipeline options to specify the query and the output for the example.
- Beam SQL DSL usage: - Search tag in class org.apache.beam.sdk.extensions.sql.SqlTransform
- Section
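A minimal sketch of the Beam SQL DSL referenced above, assuming an existing PCollection<Row> named rows whose schema has name and age fields; a single input is addressed as PCOLLECTION:

  import org.apache.beam.sdk.extensions.sql.SqlTransform;
  import org.apache.beam.sdk.values.PCollection;
  import org.apache.beam.sdk.values.Row;

  PCollection<Row> adults =
      rows.apply(SqlTransform.query(
          "SELECT name, age FROM PCOLLECTION WHERE age >= 18"));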
- BeamSqlEnv - Class in org.apache.beam.sdk.extensions.sql.impl
-
Contains the metadata of tables/UDF functions, and exposes APIs to query/validate/optimize/translate SQL statements.
- BeamSqlEnv.BeamSqlEnvBuilder - Class in org.apache.beam.sdk.extensions.sql.impl
-
BeamSqlEnv's Builder.
- BeamSqlOutputToConsoleFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
A test PTransform to display output in console.
- BeamSqlOutputToConsoleFn(String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSqlOutputToConsoleFn
- BeamSqlParser - Class in org.apache.beam.sdk.extensions.sql.impl.parser
- BeamSqlPipelineOptions - Interface in org.apache.beam.sdk.extensions.sql.impl
-
Options used to configure BeamSQL.
- BeamSqlPipelineOptionsRegistrar - Class in org.apache.beam.sdk.extensions.sql.impl
-
AutoService registrar for BeamSqlPipelineOptions.
- BeamSqlPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptionsRegistrar
- BeamSqlRelUtils - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
Utilities for BeamRelNode.
- BeamSqlRelUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- BeamSqlRow2KvFn() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.BeamSqlRow2KvFn
- BeamSqlRowComparator(List<Integer>, List<Boolean>, List<Boolean>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel.BeamSqlRowComparator
- BeamSqlSeekableTable - Interface in org.apache.beam.sdk.extensions.sql
-
A seekable table converts a JOIN operator to an inline lookup.
- BeamSqlTable - Interface in org.apache.beam.sdk.extensions.sql.meta
-
This interface defines a Beam Sql Table.
- BeamSqlTableFilter - Interface in org.apache.beam.sdk.extensions.sql.meta
-
This interface defines Beam SQL Table Filter.
- BeamSqlUdf - Interface in org.apache.beam.sdk.extensions.sql
-
Interface to create a UDF in Beam SQL.
- BeamSqlUnparseContext - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
- BeamSqlUnparseContext(IntFunction<SqlNode>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamSqlUnparseContext
- BeamStoppableFunction - Interface in org.apache.beam.runners.flink.translation.wrappers.streaming.io
-
Custom StoppableFunction for backward compatibility.
- BeamTableFunctionScanRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace TableFunctionScan.
- BeamTableFunctionScanRel(RelOptCluster, RelTraitSet, List<RelNode>, RexNode, Type, RelDataType, Set<RelColumnMapping>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
- BeamTableFunctionScanRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
This is the converter rule that converts a Calcite TableFunctionScan to BeamTableFunctionScanRel.
- BeamTableStatistics - Class in org.apache.beam.sdk.extensions.sql.impl
-
This class stores row count statistics.
- BeamTableUtils - Class in org.apache.beam.sdk.extensions.sql.impl.schema
-
Utility methods for working with BeamTable.
- BeamTableUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
- BeamUncollectRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to implement an uncorrelated Uncollect, aka UNNEST.
- BeamUncollectRel(RelOptCluster, RelTraitSet, RelNode, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
- BeamUncollectRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- BeamUnionRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Union.
- BeamUnionRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
- BeamUnionRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- BeamUnnestRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
- BeamUnnestRel(RelOptCluster, RelTraitSet, RelNode, RelDataType, List<Integer>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
- BeamUnnestRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- BeamValuesRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Values node.
- BeamValuesRel(RelOptCluster, RelDataType, ImmutableList<ImmutableList<RexLiteral>>, RelTraitSet) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
- BeamValuesRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Values with BeamValuesRel.
- BeamWindowRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Window node.
- BeamWindowRel(RelOptCluster, RelTraitSet, RelNode, List<RexLiteral>, RelDataType, List<Window.Group>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
- BeamWindowRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
- BeamWorkerStatusGrpcService - Class in org.apache.beam.runners.fnexecution.status
-
A Fn Status service which can collect run-time status information from SDK harnesses for debugging purposes.
- BeamZetaSqlCalcMergeRule - Class in org.apache.beam.sdk.extensions.sql.zetasql
-
Planner rule to merge a BeamZetaSqlCalcRel with an adjacent BeamZetaSqlCalcRel.
- BeamZetaSqlCalcMergeRule() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcMergeRule
- BeamZetaSqlCalcRel - Class in org.apache.beam.sdk.extensions.sql.zetasql
-
BeamRelNode to replace Project and Filter nodes based on the ZetaSQL expression evaluator.
- BeamZetaSqlCalcRel(RelOptCluster, RelTraitSet, RelNode, RexProgram) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRel
- BeamZetaSqlCalcRule - Class in org.apache.beam.sdk.extensions.sql.zetasql
- BeamZetaSqlCalcSplittingRule - Class in org.apache.beam.sdk.extensions.sql.zetasql
-
A BeamCalcSplittingRule that converts a LogicalCalc to BeamZetaSqlCalcRel and/or BeamCalcRel via CalcRelSplitter.
- BeamZetaSqlCatalog - Class in org.apache.beam.sdk.extensions.sql.zetasql
-
Catalog for registering tables and functions.
- BeamZetaSqlUncollectRel - Class in org.apache.beam.sdk.extensions.sql.zetasql.unnest
-
BeamRelNode to implement an uncorrelated ZetaSqlUnnest, aka UNNEST.
- BeamZetaSqlUncollectRel(RelOptCluster, RelTraitSet, RelNode, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRel
- BeamZetaSqlUncollectRule - Class in org.apache.beam.sdk.extensions.sql.zetasql.unnest
- BeamZetaSqlUnnestRel - Class in org.apache.beam.sdk.extensions.sql.zetasql.unnest
- BeamZetaSqlUnnestRel(RelOptCluster, RelTraitSet, RelNode, RelDataType, List<Integer>) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
- BeamZetaSqlUnnestRule - Class in org.apache.beam.sdk.extensions.sql.zetasql.unnest
- before_initial_sequence - Enum constant in enum class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent.Reason
- beforeProcessing(PipelineOptions) - Method in interface org.apache.beam.sdk.harness.JvmInitializer
-
Implement beforeProcessing to run custom initialization after basic services such as logging are available, but before data processing begins.
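A minimal sketch, assuming registration via AutoService so the SDK harness discovers the initializer with ServiceLoader (the class name WarmupInitializer is hypothetical):

    import com.google.auto.service.AutoService;
    import org.apache.beam.sdk.harness.JvmInitializer;
    import org.apache.beam.sdk.options.PipelineOptions;

    @AutoService(JvmInitializer.class)
    public class WarmupInitializer implements JvmInitializer {
      @Override
      public void beforeProcessing(PipelineOptions options) {
        // Runs on each worker JVM, after logging is available but before
        // any elements are processed, e.g. to set a system property.
        System.setProperty("example.initialized", "true");
      }
    }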
- beforeProcessing(PipelineOptions) - Method in class org.apache.beam.sdk.io.kafka.KafkaIOInitializer
- beforeStart(ClientCallStreamObserver<RespT>) - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
- begin() - Method in class org.apache.beam.sdk.Pipeline
-
Returns a
PBegin
owned by this Pipeline. - beginningOnDay(int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- beginningOnDay(int, int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
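For example, a hedged sketch of monthly calendar windowing (the start day 10 and the PCollection named input are assumptions):

    import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;

    // Assign elements to month-long windows beginning on the 10th day of each month.
    input.apply(Window.into(CalendarWindows.months(1).beginningOnDay(10)));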
- benchmarkHadoopLineReader(TextSourceBenchmark.Data) - Method in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark
- benchmarkTextSource(TextSourceBenchmark.Data) - Method in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark
- BIG_INT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- BIG_QUERY_INSERT_ERROR_ERROR_CONTAINER - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
- BigDecimalCoder - Class in org.apache.beam.sdk.coders
-
A BigDecimalCoder encodes a BigDecimal as an integer scale encoded with VarIntCoder and a BigInteger encoded using BigIntegerCoder
. - BigDecimalConverter - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
Provides converters from BigDecimal to other numeric types based on the input Schema.TypeName
. - BigDecimalConverter() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.BigDecimalConverter
- bigdecimals() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The
TypeDescriptor
for BigDecimal. - BigEndianIntegerCoder - Class in org.apache.beam.sdk.coders
-
A BigEndianIntegerCoder encodes Integers
in 4 bytes, big-endian. - BigEndianLongCoder - Class in org.apache.beam.sdk.coders
-
A BigEndianLongCoder encodes Longs
in 8 bytes, big-endian. - BigEndianShortCoder - Class in org.apache.beam.sdk.coders
-
A BigEndianShortCoder encodes Shorts
in 2 bytes, big-endian. - BigIntegerCoder - Class in org.apache.beam.sdk.coders
-
A BigIntegerCoder encodes a BigInteger as a byte array containing the big-endian two's-complement representation, encoded via ByteArrayCoder
. - bigintegers() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The
TypeDescriptor
for BigInteger. - BIGQUERY - Static variable in class org.apache.beam.sdk.managed.Managed
- BIGQUERY_EARLY_ROLLOUT_REGION - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
- BIGQUERY_JOB_TEMPLATE - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Template for BigQuery jobs created by BigQueryIO.
- BigqueryClient - Class in org.apache.beam.sdk.io.gcp.testing
-
A wrapper class for making BigQuery API calls.
- BigqueryClient(String) - Constructor for class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- BigQueryCoderProviderRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A CoderProviderRegistrar for standard types used with BigQueryIO
. - BigQueryCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
- BigQuery Concepts - Search tag in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
- Section
- BigQueryDirectReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- BigQueryDirectReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
An implementation of TypedSchemaTransformProvider for BigQuery Storage Read API jobs configured via BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
. - BigQueryDirectReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
A SchemaTransform for BigQuery Storage Read API, configured with BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration and instantiated by BigQueryDirectReadSchemaTransformProvider
. - BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
Configuration for reading from BigQuery with Storage Read API.
- BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQueryDlqProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryDlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryDlqProvider
- BigQueryExportReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Configuration for reading from BigQuery.
- BigQueryExportReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
- BigQueryExportReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryExportReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
-
An implementation of TypedSchemaTransformProvider for BigQuery read jobs configured using BigQueryExportReadSchemaTransformConfiguration
. - BigQueryExportReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
- BigQueryExportReadSchemaTransformProvider.BigQueryExportSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery
-
An implementation of SchemaTransform for BigQuery read jobs configured using BigQueryExportReadSchemaTransformConfiguration
. - BigQueryFileLoadsSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
An implementation of TypedSchemaTransformProvider for BigQuery write jobs configured using BigQueryWriteConfiguration
. - BigQueryFileLoadsSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider
- BigQueryFileLoadsSchemaTransformProvider.BigQueryFileLoadsSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQueryFilter - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
- BigQueryFilter(List<RexNode>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
- BigQueryHelpers - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A set of helper functions and classes used by
BigQueryIO
. - BigQueryHelpers() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
- BigQueryInsertError - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Model definition for BigQueryInsertError.
- BigQueryInsertError(TableRow, TableDataInsertAllResponse.InsertErrors, TableReference) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- BigQueryInsertErrorCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A Coder that encodes BigQuery BigQueryInsertError
objects. - BigQueryInsertErrorCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- BigQueryIO - Class in org.apache.beam.sdk.io.gcp.bigquery
-
PTransforms for reading and writing BigQuery tables.
- BigQueryIO.Read - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Implementation of
BigQueryIO.read()
. - BigQueryIO.TypedRead<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Implementation of
BigQueryIO.read(SerializableFunction)
. - BigQueryIO.TypedRead.Method - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
-
Determines the method used to read data from BigQuery.
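As a sketch, selecting the Storage Read API explicitly (the project, dataset and table names are hypothetical; p is an existing Pipeline):

    PCollection<TableRow> rows =
        p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.my_table")
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ));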
- BigQueryIO.TypedRead.QueryPriority - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
-
An enumeration type for the priority of a query.
- BigQueryIO.Write<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Implementation of
BigQueryIO.write()
. - BigQueryIO.Write.CreateDisposition - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
-
An enumeration type for the BigQuery create disposition strings.
- BigQueryIO.Write.Method - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
-
Determines the method used to insert data in BigQuery.
- BigQueryIO.Write.SchemaUpdateOption - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
-
An enumeration type for the BigQuery schema update options strings.
- BigQueryIO.Write.WriteDisposition - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
-
An enumeration type for the BigQuery write disposition strings.
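A hedged sketch combining the write method and disposition enums above (the table name and the schema variable are assumptions; rows is a PCollection<TableRow>):

    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")
            .withSchema(schema) // com.google.api.services.bigquery.model.TableSchema
            .withMethod(BigQueryIO.Write.Method.STORAGE_WRITE_API)
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));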
- BigQueryIOTranslation - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryIOTranslation() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation
- BigQueryIOTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryIOTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigqueryMatcher - Class in org.apache.beam.sdk.io.gcp.testing
-
A matcher to verify data in BigQuery by processing a given query and comparing the content's checksum with an expected value.
- BigqueryMatcher.TableAndQuery - Class in org.apache.beam.sdk.io.gcp.testing
- BigQueryOptions - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
Properties needed when using Google BigQuery with the Apache Beam SDK.
- BigQuerySchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
-
An implementation of
SchemaIOProvider
for reading and writing to BigQuery withBigQueryIO
. - BigQuerySchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
- BigQuerySchemaRetrievalException - Exception Class in org.apache.beam.sdk.io.gcp.bigquery
-
Exception to signal that BigQuery schema retrieval failed.
- BigQuerySchemaTransformTranslation - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQuerySchemaTransformTranslation() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation
- BigQuerySchemaTransformTranslation.BigQueryStorageReadSchemaTransformTranslator - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQuerySchemaTransformTranslation.BigQueryWriteSchemaTransformTranslator - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQuerySchemaTransformTranslation.ReadWriteRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQueryServices - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface for real, mock, or fake implementations of Cloud BigQuery services.
- BigQueryServices.BigQueryServerStream<T> - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
Container for reading data from streaming endpoints.
- BigQueryServices.DatasetService - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface to get, create and delete Cloud BigQuery datasets and tables.
- BigQueryServices.DatasetService.TableMetadataView - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryServices.JobService - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface for the Cloud BigQuery load service.
- BigQueryServices.StorageClient - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface representing a client object for making calls to the BigQuery Storage API.
- BigQueryServices.StreamAppendClient - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface for appending records to a Storage API write stream.
- BigQueryServices.WriteStreamService - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface to get, create and flush Cloud BigQuery Storage API write streams.
- BigQueryServicesImpl - Class in org.apache.beam.sdk.io.gcp.bigquery
-
An implementation of
BigQueryServices
that actually communicates with the cloud BigQuery service. - BigQueryServicesImpl() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
- BigQueryServicesImpl.DatasetServiceImpl - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryServicesImpl.WriteStreamServiceImpl - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQuerySinkMetrics - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Helper class to create per-worker metrics for BigQuery Sink stages.
- BigQuerySinkMetrics() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
- BigQuerySinkMetrics.RpcMethod - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryStorageApiInsertError - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryStorageApiInsertError(TableRow) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
- BigQueryStorageApiInsertError(TableRow, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
- BigQueryStorageApiInsertErrorCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryStorageApiInsertErrorCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
- BigQueryStorageReadSchemaTransformTranslator() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.BigQueryStorageReadSchemaTransformTranslator
- BigQueryStorageTableSource<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A
Source
representing reading from a table. - BigQueryStorageWriteApiSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
An implementation of TypedSchemaTransformProvider for BigQuery Storage Write API jobs configured via BigQueryWriteConfiguration
. - BigQueryStorageWriteApiSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
A SchemaTransform for BigQuery Storage Write API, configured with BigQueryWriteConfiguration and instantiated by BigQueryStorageWriteApiSchemaTransformProvider
. - BigQueryTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
-
BigQuery table provider.
- BigQueryTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
- BigQueryUtils - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Utility methods for BigQuery related operations.
- BigQueryUtils() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- BigQueryUtils.ConversionOptions - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Options for how to convert BigQuery data to Beam data.
- BigQueryUtils.ConversionOptions.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Builder for
BigQueryUtils.ConversionOptions
. - BigQueryUtils.ConversionOptions.TruncateTimestamps - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
-
Controls whether to truncate timestamps to millisecond precision lossily, or to crash when truncation would occur.
- BigQueryUtils.SchemaConversionOptions - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Options for how to convert BigQuery schemas to Beam schemas.
- BigQueryUtils.SchemaConversionOptions.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Builder for
BigQueryUtils.SchemaConversionOptions
. - BigQueryWriteConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
Configuration for writing to BigQuery with SchemaTransforms.
- BigQueryWriteConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- BigQueryWriteConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
Builder for
BigQueryWriteConfiguration
. - BigQueryWriteConfiguration.ErrorHandling - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQueryWriteConfiguration.ErrorHandling.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQueryWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
A BigQuery Write SchemaTransformProvider that routes to either BigQueryFileLoadsSchemaTransformProvider or BigQueryStorageWriteApiSchemaTransformProvider
. - BigQueryWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteSchemaTransformProvider
- BigQueryWriteSchemaTransformProvider.BigQueryWriteSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQueryWriteSchemaTransformTranslator() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.BigQueryWriteSchemaTransformTranslator
- BigtableChangeStreamAccessor - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
-
This is probably a temporary solution as part of a bigger migration from cloud-bigtable-client-core to java-bigtable.
- BigtableChangeStreamTestOptions - Interface in org.apache.beam.sdk.io.gcp.bigtable.changestreams
- BigtableClientOverride - Interface in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
-
Override the configuration of the Cloud Bigtable data and admin clients.
- BigtableConfig - Class in org.apache.beam.sdk.io.gcp.bigtable
-
Configuration for a Cloud Bigtable client.
- BigtableConfig() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
- BigtableIO - Class in org.apache.beam.sdk.io.gcp.bigtable
-
Transforms
for reading from and writing to Google Cloud Bigtable. - BigtableIO.ExistingPipelineOptions - Enum Class in org.apache.beam.sdk.io.gcp.bigtable
-
Options to determine what to do if the change stream name is being reused and metadata of the same change stream name already exists.
- BigtableIO.Read - Class in org.apache.beam.sdk.io.gcp.bigtable
-
A
PTransform
that reads from Google Cloud Bigtable. - BigtableIO.ReadChangeStream - Class in org.apache.beam.sdk.io.gcp.bigtable
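A minimal read sketch (the project, instance and table identifiers are hypothetical; p is an existing Pipeline):

    PCollection<com.google.bigtable.v2.Row> bigtableRows =
        p.apply(
            BigtableIO.read()
                .withProjectId("my-project")
                .withInstanceId("my-instance")
                .withTableId("my-table"));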
- BigtableIO.Write - Class in org.apache.beam.sdk.io.gcp.bigtable
-
A
PTransform
that writes to Google Cloud Bigtable. - BigtableIO.WriteWithResults - Class in org.apache.beam.sdk.io.gcp.bigtable
-
A PTransform that writes to Google Cloud Bigtable and emits a BigtableWriteResult
for each batch written. - BigtableReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
- BigtableReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigtable
-
An implementation of TypedSchemaTransformProvider for Bigtable Read jobs configured via BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
. - BigtableReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
- BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigtable
-
Configuration for reading from Bigtable.
- BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigtable
- BigtableReadSchemaTransformProvider.BigtableRowToBeamRow - Class in org.apache.beam.sdk.io.gcp.bigtable
- BigtableRowToBeamRow - Class in org.apache.beam.sdk.io.gcp.bigtable
- BigtableRowToBeamRow() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableRowToBeamRow
- BigtableRowToBeamRow(Schema) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRow
- BigtableRowToBeamRowFlat - Class in org.apache.beam.sdk.io.gcp.bigtable
- BigtableRowToBeamRowFlat(Schema, Map<String, Set<String>>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRowFlat
- BigtableTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigtable
- BigtableTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigtable
- BigtableTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTableProvider
- BigtableUtils - Class in org.apache.beam.sdk.io.gcp.testing
- BigtableUtils() - Constructor for class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
- BigtableWriteResult - Class in org.apache.beam.sdk.io.gcp.bigtable
-
The result of writing a batch of rows to Bigtable.
- BigtableWriteResult() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
- BigtableWriteResultCoder - Class in org.apache.beam.sdk.io.gcp.bigtable
-
A coder for
BigtableWriteResult
. - BigtableWriteResultCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
- BigtableWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
- BigtableWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigtable
-
An implementation of TypedSchemaTransformProvider for Bigtable Write jobs configured via BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
. - BigtableWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
- BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigtable
-
Configuration for writing to Bigtable.
- BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigtable
- BigtableWriteSchemaTransformProvider.GetMutationsFromBeamRow - Class in org.apache.beam.sdk.io.gcp.bigtable
- BinaryCombineDoubleFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- BinaryCombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
- BinaryCombineIntegerFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- BinaryCombineLongFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- bind(String, StateBinder) - Method in interface org.apache.beam.sdk.state.StateSpec
-
For internal use only; no backwards-compatibility guarantees.
- bindBag(String, StateSpec<BagState<T>>, Coder<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindBag(String, StateSpec<BagState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
- bindCombining(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindCombining(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Method in interface org.apache.beam.sdk.state.StateBinder
- bindCombiningWithContext(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindCombiningWithContext(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Method in interface org.apache.beam.sdk.state.StateBinder
- Binding AMQP and receive messages - Search tag in class org.apache.beam.sdk.io.amqp.AmqpIO
- Section
- bindMap(String, StateSpec<MapState<KeyT, ValueT>>, Coder<KeyT>, Coder<ValueT>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindMap(String, StateSpec<MapState<KeyT, ValueT>>, Coder<KeyT>, Coder<ValueT>) - Method in interface org.apache.beam.sdk.state.StateBinder
- bindMultimap(String, StateSpec<MultimapState<KeyT, ValueT>>, Coder<KeyT>, Coder<ValueT>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindMultimap(String, StateSpec<MultimapState<KeyT, ValueT>>, Coder<KeyT>, Coder<ValueT>) - Method in interface org.apache.beam.sdk.state.StateBinder
- bindOrderedList(String, StateSpec<OrderedListState<T>>, Coder<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindOrderedList(String, StateSpec<OrderedListState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
- bindSet(String, StateSpec<SetState<T>>, Coder<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindSet(String, StateSpec<SetState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
- bindValue(String, StateSpec<ValueState<T>>, Coder<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindValue(String, StateSpec<ValueState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
- bindWatermark(String, StateSpec<WatermarkHoldState>, TimestampCombiner) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- bindWatermark(String, StateSpec<WatermarkHoldState>, TimestampCombiner) - Method in interface org.apache.beam.sdk.state.StateBinder
-
Bind to a watermark
StateSpec
. - BIT_XOR - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- BitSetCoder - Class in org.apache.beam.sdk.coders
-
Coder for
BitSet
. - BitXOr() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- BlackholeOutput() - Constructor for class org.apache.beam.sdk.jmh.util.VarIntBenchmark.BlackholeOutput
- BlobstoreClientBuilderFactory - Interface in org.apache.beam.sdk.io.azure.options
-
Construct BlobServiceClientBuilder from Azure pipeline options.
- BlobstoreOptions - Interface in org.apache.beam.sdk.io.azure.options
-
Options used to configure Microsoft Azure Blob Storage.
- Block() - Constructor for class org.apache.beam.sdk.io.BlockBasedSource.Block
- BlockBasedReader(BlockBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
- BlockBasedSource<T> - Class in org.apache.beam.sdk.io
-
A BlockBasedSource is a FileBasedSource
where a file consists of blocks of records. - BlockBasedSource(String, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
-
Like BlockBasedSource(String, EmptyMatchTreatment, long) but with a default EmptyMatchTreatment of EmptyMatchTreatment.DISALLOW
. - BlockBasedSource(String, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
-
Creates a
BlockBasedSource
based on a file name or pattern. - BlockBasedSource(MatchResult.Metadata, long, long, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
-
Creates a
BlockBasedSource
for a single file. - BlockBasedSource(ValueProvider<String>, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
- BlockBasedSource(ValueProvider<String>, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
- BlockBasedSource.Block<T> - Class in org.apache.beam.sdk.io
-
A
Block
represents a block of records that can be read. - BlockBasedSource.BlockBasedReader<T> - Class in org.apache.beam.sdk.io
-
A Reader that reads records from a BlockBasedSource
. - BlockingCommitterImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- BlockTracker(OffsetRange, long, long) - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles.BlockTracker
- BOOL - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- BOOL - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- BOOLEAN - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- BOOLEAN - Enum constant in enum class org.apache.beam.sdk.transforms.display.DisplayData.Type
- BOOLEAN - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- BOOLEAN - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of boolean fields.
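For example, a small sketch of declaring a schema with a boolean field (the field names are illustrative):

    import org.apache.beam.sdk.schemas.Schema;

    Schema schema =
        Schema.builder()
            .addStringField("name")
            .addField("active", Schema.FieldType.BOOLEAN)
            .build();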
- BooleanCoder - Class in org.apache.beam.sdk.coders
- BooleanCoder() - Constructor for class org.apache.beam.sdk.coders.BooleanCoder
- booleans() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The
TypeDescriptor
for Boolean. - booleanToByteArray(boolean) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
- borrowDataset(PTransform<? extends PValue, ?>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- borrowDataset(PValue) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- BOTH - Enum constant in enum class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.PushDownOptions
- bounded(String, BoundedSource<T>, SerializablePipelineOptions, int) - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- Bounded(SparkContext, BoundedSource<T>, SerializablePipelineOptions, String) - Constructor for class org.apache.beam.runners.spark.io.SourceRDD.Bounded
- BOUNDED - Enum constant in enum class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.IsBounded
-
Indicates that a
Restriction
represents a bounded amount of work. - BOUNDED - Enum constant in enum class org.apache.beam.sdk.values.PCollection.IsBounded
-
Indicates that a
PCollection
contains a bounded number of elements. - BOUNDED_UNKNOWN - Static variable in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- BoundedDataset<T> - Class in org.apache.beam.runners.spark.translation
-
Holds an RDD or values for deferred conversion to an RDD if needed.
- BoundedDatasetFactory - Class in org.apache.beam.runners.spark.structuredstreaming.io
- boundedImpulse() - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- boundedness - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- Boundedness - Search tag in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
- Section
- BoundedReader() - Constructor for class org.apache.beam.sdk.io.BoundedSource.BoundedReader
- BoundedReadFromUnboundedSource<T> - Class in org.apache.beam.sdk.io
-
PTransform that reads a bounded amount of data from an UnboundedSource
, specified as one or both of a maximum number of elements or a maximum period of time to read. - BoundedSource<T> - Class in org.apache.beam.sdk.io
-
A
Source
that reads a finite amount of input and, because of that, supports some additional operations. - BoundedSource() - Constructor for class org.apache.beam.sdk.io.BoundedSource
- BoundedSource.BoundedReader<T> - Class in org.apache.beam.sdk.io
-
A
Reader
that reads a bounded amount of input and supports some additional operations, such as progress estimation and dynamic work rebalancing. - BoundedSourceP<T> - Class in org.apache.beam.runners.jet.processors
-
Jet
Processor
implementation for reading from a bounded Beam source. - boundedTrie(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that accumulates and reports a set of unique string values, bounded to a maximum limit.
- boundedTrie(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that accumulates and reports a set of unique string values, bounded to a maximum limit.
- BoundedTrie - Interface in org.apache.beam.sdk.metrics
-
Internal: For internal use only and not for public consumption.
- BoundedTrieImpl - Class in org.apache.beam.runners.jet.metrics
-
Implementation of
BoundedTrie
. - BoundedTrieImpl(MetricName) - Constructor for class org.apache.beam.runners.jet.metrics.BoundedTrieImpl
- BoundedTrieResult - Class in org.apache.beam.sdk.metrics
-
Internal: For internal use only and not for public consumption.
- BoundedTrieResult() - Constructor for class org.apache.beam.sdk.metrics.BoundedTrieResult
- BoundedWindow - Class in org.apache.beam.sdk.transforms.windowing
-
A
BoundedWindow
represents window information assigned to data elements. - BoundedWindow() - Constructor for class org.apache.beam.sdk.transforms.windowing.BoundedWindow
- boxIfPrimitive(TypeDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- bqServices - Variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- broadcast(JavaSparkContext) - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
- broadcast(T) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- BrokerResponse - Class in org.apache.beam.sdk.io.solace.broker
- BrokerResponse(int, String, InputStream) - Constructor for class org.apache.beam.sdk.io.solace.broker.BrokerResponse
- bucketAccessible(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns whether the GCS bucket exists and is accessible.
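A hedged usage sketch (the bucket URI is hypothetical; options is an existing PipelineOptions; the call may throw IOException):

    import org.apache.beam.sdk.extensions.gcp.options.GcsOptions;
    import org.apache.beam.sdk.extensions.gcp.util.GcsUtil;
    import org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath;

    GcsUtil gcsUtil = options.as(GcsOptions.class).getGcsUtil();
    boolean accessible = gcsUtil.bucketAccessible(GcsPath.fromUri("gs://my-bucket"));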
- bucketOwner(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns the project number of the project which owns this bucket.
- buffer(BufferedElement) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.KeyedBufferingElementsHandler
- buffer(BufferedElement) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.NonKeyedBufferingElementsHandler
- buffered - Enum constant in enum class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent.Reason
- BufferedElement - Interface in org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput
-
An interface for elements buffered during a checkpoint when using @RequiresStableInput.
- BufferedExternalSorter - Class in org.apache.beam.sdk.extensions.sorter
-
Sorter that will use in-memory sorting until the values can't fit into memory and will then fall back to external sorting.
- BufferedExternalSorter.Options - Class in org.apache.beam.sdk.extensions.sorter
-
Contains configuration for the sorter.
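A brief sketch of using this sorter with SortValues after a GroupByKey (the grouped PCollection and its key/value types are assumptions):

    import org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter;
    import org.apache.beam.sdk.extensions.sorter.SortValues;

    // Sorts each primary key's Iterable<KV<secondary key, value>> by secondary key.
    PCollection<KV<String, Iterable<KV<String, Integer>>>> sorted =
        grouped.apply(
            SortValues.<String, String, Integer>create(BufferedExternalSorter.options()));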
- bufferingDoFnRunner - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- BufferingDoFnRunner<InputT,
OutputT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput -
A DoFnRunner which buffers data for supporting DoFn.RequiresStableInput
. - BufferingStreamObserver<T> - Class in org.apache.beam.sdk.fn.stream
-
A thread-safe StreamObserver which uses a bounded queue to pass elements to a processing thread responsible for interacting with the underlying CallStreamObserver
. - BufferingStreamObserver(Phaser, CallStreamObserver<T>, ExecutorService, int) - Constructor for class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
- build() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- build() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
- build() - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.Builder
- build() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions.Builder
- build() - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText.Builder
- build() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
- build() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
- build() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
- build() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
- build() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions.Builder
- build() - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Builder
- build() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Build function to create an instance of BeamSqlEnv based on preset fields.
- build() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
- build() - Method in class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexNode.Builder
- build() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
- build() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.Builder
- build() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally.Builder
- build() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey.Builder
- build() - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
- build() - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider.Builder
-
Validates and fully initializes a
StsAssumeRoleForFederatedCredentialsProvider
instance. - build() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation.Builder
- build() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey.Builder
- build() - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.cdap.ConfigWrapper
- build() - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
- build() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration.Builder
-
Builds a
CsvWriteTransformProvider.CsvWriteConfiguration
instance. - build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.CsvConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
-
Builds the
BigQueryExportReadSchemaTransformConfiguration
configuration. - build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
-
Builds a
BigQueryWriteConfiguration
instance. - build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
-
Builds a
BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
instance. - build() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Create a new instance of
RpcQosOptions
from the current builder state. - build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Builds the
ChangeStreamRecordMetadata
. - build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Builds a
PartitionMetadata
from the given fields. - build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.Builder
- build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination.Builder
- build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig.Builder
- build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
- build() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration.Builder
-
Builds a
JsonWriteTransformProvider.JsonWriteConfiguration
instance. - build() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
-
Builds a
KafkaReadSchemaTransformConfiguration
instance. - build() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
-
Builds the
SingleStoreSchemaTransformReadConfiguration
configuration. - build() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration.Builder
-
Builds the
SingleStoreSchemaTransformWriteConfiguration
configuration. - build() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory.Builder
- build() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory.Builder
- build() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
- build() - Method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey.Builder
- build() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Destination.Builder
- build() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult.Builder
- build() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- build() - Method in class org.apache.beam.sdk.io.sparkreceiver.ReceiverBuilder
- build() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
- build() - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration.Builder
-
Builds the
TFRecordReadSchemaTransformConfiguration
configuration. - build() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration.Builder
-
Builds the
TFRecordWriteSchemaTransformConfiguration
configuration. - build() - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config.Builder
- build() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
- build() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate.Builder
- build() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Builder
- build() - Method in class org.apache.beam.sdk.schemas.io.Failure.Builder
- build() - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- build() - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
- build() - Method in class org.apache.beam.sdk.schemas.Schema.Options.Builder
- build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling.Builder
- build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration.Builder
- build() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Builder
- build() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
- build() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
- build() - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError.Builder
- build() - Method in class org.apache.beam.sdk.values.Row.Builder
- build() - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
- build(String) - Method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.MetricNameBuilder
- build(BeamSqlEnv.BeamSqlEnvBuilder, boolean, PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
- buildBeamSqlNullableSchema(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
- buildBeamSqlSchema(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
-
Create a RowsBuilder with the specified row type info.
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTableProvider
-
Instantiates the DataGeneratorTable when a CREATE EXTERNAL TABLE statement with TYPE 'datagen'
is executed. - buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.KafkaTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.parquet.ParquetTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite.PubsubLiteTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.seqgen.GenerateSequenceTableProvider
- buildBeamSqlTable(Table) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Build a
BeamSqlTable
using the given table meta info. - buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
- buildClient(AwsOptions, BuilderT, ClientConfiguration) - Static method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
Utility to directly build a client from the given client builder and client configuration
- buildDatasource() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Builds
SnowflakeBasicDataSource
based on the current configuration. - builder() - Static method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.io.requestresponse.Monitoring
- builder() - Static method in class org.apache.beam.runners.jobsubmission.JobPreparation
- builder() - Static method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration
- builder() - Static method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
-
Returns a
GcsCreateOptions.Builder
. - builder() - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
- builder() - Static method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
- builder() - Static method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
- builder() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- builder() - Static method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase
-
Creates a ParameterListBuilder.
- builder() - Static method in class org.apache.beam.sdk.extensions.sql.meta.Table
- builder() - Static method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider
-
Creates a builder for the type
StsAssumeRoleForFederatedCredentialsProvider
. - builder() - Static method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation
- builder() - Static method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
- builder() - Static method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Creates a new uninitialized
S3FileSystemConfiguration.Builder
. - builder() - Static method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
- builder() - Static method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.cdap.Plugin
-
Creates a plugin builder instance.
- builder() - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- builder() - Static method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions
-
Returns a CreateOptions.StandardCreateOptions.Builder.
- builder() - Static method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
-
Instantiates a BigQueryExportReadSchemaTransformConfiguration.Builder.
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
-
Instantiates a BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder instance.
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
-
Instantiates a BigQueryWriteConfiguration.Builder instance.
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
-
Instantiates a BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder instance.
- builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling
- builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling
- builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
- builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- builder() - Static method in class org.apache.beam.sdk.io.iceberg.IcebergDestination
- builder() - Static method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- builder() - Static method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
- builder() - Static method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- builder() - Static method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- builder() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
-
Instantiates a KafkaReadSchemaTransformConfiguration.Builder instance.
- builder() - Static method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- builder() - Static method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
-
Instantiates a SingleStoreSchemaTransformReadConfiguration.Builder.
- builder() - Static method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
-
Instantiates a SingleStoreSchemaTransformWriteConfiguration.Builder.
- builder() - Static method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
- builder() - Static method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory
- builder() - Static method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
- builder() - Static method in class org.apache.beam.sdk.io.solace.data.Semp.Queue
- builder() - Static method in class org.apache.beam.sdk.io.solace.data.Semp.QueueData
- builder() - Static method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey
- builder() - Static method in class org.apache.beam.sdk.io.solace.data.Solace.Destination
- builder() - Static method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
- builder() - Static method in class org.apache.beam.sdk.io.solace.data.Solace.Record
- builder() - Static method in class org.apache.beam.sdk.io.solace.RetryCallableManager
- builder() - Static method in class org.apache.beam.sdk.io.TextRowCountEstimator
- builder() - Static method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration
-
Instantiates a TFRecordReadSchemaTransformConfiguration.Builder instance.
- builder() - Static method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
-
Instantiates a TFRecordWriteSchemaTransformConfiguration.Builder instance.
- builder() - Static method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config
- builder() - Static method in class org.apache.beam.sdk.metrics.MetricsFilter
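An illustrative sketch of MetricsFilter.builder() usage; the namespace and counter names below are hypothetical:
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    // Match only the metric named "myCounter" in namespace "myNamespace" (both names are illustrative).
    MetricsFilter filter =
        MetricsFilter.builder()
            .addNameFilter(MetricNameFilter.named("myNamespace", "myCounter"))
            .build();
Such a filter is typically passed to PipelineResult.metrics().queryMetrics(filter) to narrow the returned metric results.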
- builder() - Static method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
- builder() - Static method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate
- builder() - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
- builder() - Static method in class org.apache.beam.sdk.schemas.Schema
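A minimal sketch of Schema.builder() usage; the field names are illustrative:
    import org.apache.beam.sdk.schemas.Schema;

    // Assemble a two-field schema; addStringField and addInt32Field append fields in order.
    Schema schema =
        Schema.builder()
            .addStringField("name")
            .addInt32Field("age")
            .build();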
- builder() - Static method in class org.apache.beam.sdk.schemas.Schema.Options
- builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider.Configuration
- builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration
- builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
- builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
- builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
- builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
- builder() - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
- builder() - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
- builder() - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record
- builder(Dialect) - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
- builder(CatalogManager) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
-
Creates a builder with the default schema backed by the catalog manager.
- builder(TableProvider) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
-
Creates a builder with the default schema backed by the table provider.
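A minimal sketch of this builder, assuming the returned BeamSqlEnv builder exposes setPipelineOptions and build; InMemoryMetaStore stands in for any TableProvider:
    import org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv;
    import org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Build a SQL environment whose default schema is backed by an in-memory table provider.
    BeamSqlEnv env =
        BeamSqlEnv.builder(new InMemoryMetaStore())
            .setPipelineOptions(PipelineOptionsFactory.create()) // assumption: the builder accepts pipeline options
            .build();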
- builder(RexNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexNode
- Builder() - Constructor for class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.io.requestresponse.Monitoring.Builder
- Builder() - Constructor for class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.ml.AnnotateText.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally.Builder
- Builder() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.cdap.Plugin.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.CsvConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergDestination.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.Destination.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.PublishResult.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.solace.RetryCallableManager.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.splunk.SplunkWriteError.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config.Builder
- Builder() - Constructor for class org.apache.beam.sdk.metrics.MetricsFilter.Builder
- Builder() - Constructor for class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.io.Failure.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.Schema.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.Schema.Field.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.transforms.DoFnSchemaInformation.Builder
- Builder() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Builder
- Builder() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
- Builder() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
- Builder() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.Builder
- Builder() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError.Builder
- Builder() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.ParseResult.Builder
- Builder(JodaClock, FirestoreStatefulComponentFactory, RpcQosOptions, boolean, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
- Builder(JodaClock, FirestoreStatefulComponentFactory, RpcQosOptions, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments.Builder
- Builder(RexNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexNode.Builder
- builderForType(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- builderFrom(S3Options) - Static method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Creates a new S3FileSystemConfiguration.Builder with values initialized by the properties of s3Options.
- buildExternal(ConfigT) - Method in interface org.apache.beam.sdk.transforms.ExternalTransformBuilder
-
Builds the transform after it has been configured.
- buildExternal(DebeziumTransformRegistrar.ReadBuilder.Configuration) - Method in class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder
- buildExternal(KinesisTransformRegistrar.ReadDataBuilder.Configuration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder
- buildExternal(KinesisTransformRegistrar.WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder
- buildExternal(ExternalRead.Configuration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.ReadBuilder
- buildExternal(ExternalWrite.Configuration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.WriteBuilder
- buildExternal(SpannerTransformRegistrar.ReadBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder
- buildExternal(SpannerTransformRegistrar.WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.DeleteBuilder
- buildExternal(SpannerTransformRegistrar.WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.InsertBuilder
- buildExternal(SpannerTransformRegistrar.WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.InsertOrUpdateBuilder
- buildExternal(SpannerTransformRegistrar.WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReplaceBuilder
- buildExternal(SpannerTransformRegistrar.WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.UpdateBuilder
- buildExternal(ReadBuilder.Configuration) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.ReadBuilder
- buildExternal(WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder
- buildFrom(DescriptorProtos.FileDescriptorSet) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- buildFrom(Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- buildFrom(Descriptors.FileDescriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- buildFrom(InputStream) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- Building a Managed turnkey transform - Search tag in class org.apache.beam.sdk.managed.Managed
- Section
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
- buildIOReader(PBegin) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Creates a PCollection<Row> from the source.
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTable
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
- buildIOReader(PBegin, BeamSqlTableFilter, List<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
- buildIOReader(PBegin, BeamSqlTableFilter, List<String>) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Creates a PCollection<Row> from the source with predicates and/or projections pushed down.
- buildIOReader(PBegin, BeamSqlTableFilter, List<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
- buildIOReader(PBegin, BeamSqlTableFilter, List<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTable
- buildIOReader(PBegin, BeamSqlTableFilter, List<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
- buildIOWriter(PCollection<Row>) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Creates an IO.write() instance to write to the target.
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTable
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCoGBKJoinRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamPushDownIOSourceRel
- buildPTransform() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputJoinRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputLookupJoinRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRel
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
- buildPTransform(PTransform<PCollection<Row>, ? extends POutput>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
- buildPTransform(PTransform<PCollection<Row>, ? extends POutput>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
- buildPTransform(PTransform<PCollection<Row>, ? extends POutput>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRel
- buildReader() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- buildReader() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIO
-
Returns a schema-aware reader.
- buildRows(Schema, List<?>) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
-
A convenient way to build BeamSqlRows.
- buildSchemaWithAttributes(Schema, List<String>, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
-
Builds a new Schema by adding additional optional attributes and a map field to the provided schema.
- buildTemporaryFilename(ResourceId, String) - Static method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
Constructs a temporary file resource given the temporary directory and a filename.
- buildTransform(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.AvroReadSchemaTransformFormatProvider
- buildTransform(FileReadSchemaTransformConfiguration) - Method in interface org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformFormatProvider
- buildTransform(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.JsonReadSchemaTransformFormatProvider
- buildTransform(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.LineReadSchemaTransformFormatProvider
- buildTransform(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.ParquetReadSchemaTransformFormatProvider
- buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.AvroWriteSchemaTransformFormatProvider
-
Builds a PTransform that transforms a Row PCollection into a result PCollectionTuple with two tags, one for file names written using AvroIO.Write, another for errored-out rows.
- buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.CsvWriteSchemaTransformFormatProvider
- buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in interface org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformFormatProvider
-
Builds a PTransform that writes a Row PCollection and outputs the resulting PCollectionTuple with two tags, one for the file names and another for errored-out rows.
- buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.JsonWriteSchemaTransformFormatProvider
-
Builds a PTransform that transforms a Row PCollection into a result PCollectionTuple with two tags, one for file names written using TextIO.Write, another for errored-out rows.
- buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.ParquetWriteSchemaTransformFormatProvider
-
Builds a PTransform that transforms a Row PCollection into a result PCollectionTuple with two tags, one for file names written using ParquetIO.Sink and FileIO.Write, another for errored-out rows.
- buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.XmlWriteSchemaTransformFormatProvider
-
Builds a PTransform that transforms a Row PCollection into a result PCollectionTuple with two tags, one for file names written using XmlIO.Sink and FileIO.Write, another for errored-out rows.
- buildTwoInputStream(KeyedStream<WindowedValue<KV<K, InputT>>, FlinkKey>, DataStream<RawUnionValue>, String, WindowDoFnOperator<K, InputT, OutputT>, TypeInformation<WindowedValue<KV<K, OutputT>>>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
- buildWriter() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- buildWriter() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIO
-
Returns a schema-aware writer.
- BUILTIN_AGGREGATOR_FACTORIES - Static variable in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations
- BUILTIN_ANALYTIC_FACTORIES - Static variable in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
- BuiltinHashFunctions - Class in org.apache.beam.sdk.extensions.sql.impl.udf
-
Hash Functions.
- BuiltinHashFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
- BuiltinStringFunctions - Class in org.apache.beam.sdk.extensions.sql.impl.udf
-
BuiltinStringFunctions.
- BuiltinStringFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- BuiltinTrigonometricFunctions - Class in org.apache.beam.sdk.extensions.sql.impl.udf
-
TrigonometricFunctions.
- BuiltinTrigonometricFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinTrigonometricFunctions
- bulkIO() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
- BulkIO() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
- Bulk reading of a single query or table - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- Bulk reading of multiple queries or tables - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- Bundle<T, CollectionT> - Interface in org.apache.beam.runners.local
-
An immutable collection of elements which are part of a PCollection.
- BUNDLE - Enum constant in enum class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
-
The source file contains one or more lines of newline-delimited JSON (ndjson).
- BundleCheckpointHandler - Interface in org.apache.beam.runners.fnexecution.control
-
A handler which is invoked when the SDK returns BeamFnApi.DelayedBundleApplications as part of the bundle completion.
- BundleCheckpointHandlers - Class in org.apache.beam.runners.fnexecution.control
-
Utility methods for creating BundleCheckpointHandlers.
- BundleCheckpointHandlers() - Constructor for class org.apache.beam.runners.fnexecution.control.BundleCheckpointHandlers
- BundleCheckpointHandlers.StateAndTimerBundleCheckpointHandler<T> - Class in org.apache.beam.runners.fnexecution.control
-
A BundleCheckpointHandler which uses TimerInternals.TimerData and ValueState to reschedule BeamFnApi.DelayedBundleApplication.
- BundleFinalizationHandler - Interface in org.apache.beam.runners.fnexecution.control
-
A handler for the runner when a finalization request has been received.
- BundleFinalizationHandlers - Class in org.apache.beam.runners.fnexecution.control
-
Utility methods for creating BundleFinalizationHandlers.
- BundleFinalizationHandlers() - Constructor for class org.apache.beam.runners.fnexecution.control.BundleFinalizationHandlers
- BundleFinalizationHandlers.InMemoryFinalizer - Class in org.apache.beam.runners.fnexecution.control
- bundleFinalizer() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.FlinkStepContext
- BundleProcessorCacheTimeoutFactory() - Constructor for class org.apache.beam.sdk.options.SdkHarnessOptions.BundleProcessorCacheTimeoutFactory
- BundleProgressHandler - Interface in org.apache.beam.runners.fnexecution.control
-
A handler for bundle progress messages, both during bundle execution and on its completion.
- BundleSplitHandler - Interface in org.apache.beam.runners.fnexecution.control
-
A handler which is invoked whenever an active bundle is split.
- by(Contextful<Contextful.Fn<UserT, DestinationT>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Like FileIO.Write.by(org.apache.beam.sdk.transforms.SerializableFunction<UserT, DestinationT>), but with access to context such as side inputs.
- by(SerializableFunction<UserT, DestinationT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Specifies how to partition elements into groups ("destinations").
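A hedged sketch of by(...) in a dynamic write; the Transaction type, its getType getter, and the output path are hypothetical:
    // Partition elements by a destination key, then write one group of files per destination.
    transactions.apply(
        FileIO.<String, Transaction>writeDynamic()
            .by(Transaction::getType) // hypothetical getter that selects the destination
            .via(Contextful.fn(Transaction::toString), TextIO.sink())
            .to("/tmp/transactions") // hypothetical output directory
            .withDestinationCoder(StringUtf8Coder.of())
            .withNaming(type -> FileIO.Write.defaultNaming("txn-" + type, ".txt")));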
- by(PredicateT) - Static method in class org.apache.beam.sdk.transforms.Filter
-
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that satisfy the given predicate.
- By() - Constructor for class org.apache.beam.sdk.schemas.transforms.CoGroup.By
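A minimal sketch of the Filter.by entry above; 'words' is a hypothetical PCollection<String>:
    import org.apache.beam.sdk.transforms.Filter;
    import org.apache.beam.sdk.values.PCollection;

    // Keep only the elements for which the predicate returns true.
    PCollection<String> longWords =
        words.apply(Filter.by((String w) -> w.length() > 5));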
- byFieldAccessDescriptor(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
-
Returns a transform that groups all elements in the input PCollection keyed by the fields specified.
- byFieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
-
Returns a transform that groups all elements in the input PCollection keyed by the list of fields specified.
- byFieldIds(Iterable<Integer>) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
-
Same as Group.byFieldIds(Integer...).
- byFieldNames(Iterable<String>) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
-
Same as Group.byFieldNames(String...).
- byFieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
-
Returns a transform that groups all elements in the input PCollection keyed by the list of fields specified.
- ByFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.ByFields
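A minimal sketch of the Group.byFieldNames entry above; 'purchases' and the "userId" field are hypothetical:
    import org.apache.beam.sdk.schemas.transforms.Group;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Group a schema'd PCollection by one of its fields; each output Row pairs a key with the grouped values.
    PCollection<Row> grouped = purchases.apply(Group.byFieldNames("userId"));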
- byId(int, int, RetryConfiguration, AsyncBatchWriteHandler.Stats, BiFunction<String, List<RecT>, CompletableFuture<List<ErrT>>>, Function<ErrT, String>, Function<RecT, String>, Function<ErrT, String>) - Static method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
AsyncBatchWriteHandler that correlates records and results by id; all returned results are errors.
- byId(int, FluentBackoff, AsyncBatchWriteHandler.Stats, BiFunction<String, List<RecT>, CompletableFuture<List<ErrT>>>, Function<ErrT, String>, Function<RecT, String>, Function<ErrT, String>) - Static method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
AsyncBatchWriteHandler that correlates records and results by id; all returned results are errors.
- byKey() - Static method in class org.apache.beam.sdk.transforms.Redistribute
- byPosition(int, int, RetryConfiguration, AsyncBatchWriteHandler.Stats, BiFunction<String, List<RecT>, CompletableFuture<List<ResT>>>, Function<ResT, String>) - Static method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
AsyncBatchWriteHandler that correlates records and results by position in the respective list.
- byPosition(int, FluentBackoff, AsyncBatchWriteHandler.Stats, BiFunction<String, List<RecT>, CompletableFuture<List<ResT>>>, Function<ResT, String>) - Static method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
AsyncBatchWriteHandler that correlates records and results by position in the respective list.
- BYTE - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- BYTE - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of byte fields.
- ByteArray - Class in org.apache.beam.runners.spark.util
-
Serializable byte array.
- ByteArray(byte[]) - Constructor for class org.apache.beam.runners.spark.util.ByteArray
- ByteArrayCoder - Class in org.apache.beam.sdk.coders
-
A Coder for byte[].
- ByteArrayKey(byte[]) - Constructor for class org.apache.beam.runners.jet.Utils.ByteArrayKey
- ByteBuddyUtils - Class in org.apache.beam.sdk.schemas.utils
- ByteBuddyUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils
- ByteBuddyUtils.ConvertType - Class in org.apache.beam.sdk.schemas.utils
-
Given a Java type, returns the Java type expected for use with Row.
- ByteBuddyUtils.ConvertValueForGetter - Class in org.apache.beam.sdk.schemas.utils
-
Takes a StackManipulation that returns a value.
- ByteBuddyUtils.ConvertValueForSetter - Class in org.apache.beam.sdk.schemas.utils
-
Row calls the setter with its internal Java type; however, the user object being set might use a different type internally.
- ByteBuddyUtils.DefaultTypeConversionsFactory - Class in org.apache.beam.sdk.schemas.utils
- ByteBuddyUtils.InjectPackageStrategy - Class in org.apache.beam.sdk.schemas.utils
-
A naming strategy for ByteBuddy classes.
- ByteBuddyUtils.TransformingMap<K1, V1, K2, V2> - Class in org.apache.beam.sdk.schemas.utils
- ByteBuddyUtils.TypeConversionsFactory - Interface in org.apache.beam.sdk.schemas.utils
- ByteBufferBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ByteBufferBundle
- ByteCoder - Class in org.apache.beam.sdk.coders
- ByteKey - Class in org.apache.beam.sdk.io.range
-
A class representing a key consisting of an array of bytes.
- ByteKeyRange - Class in org.apache.beam.sdk.io.range
-
A class representing a range of ByteKeys.
- ByteKeyRangeTracker - Class in org.apache.beam.sdk.io.range
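A small sketch of constructing a ByteKeyRange; the bounds are arbitrary example bytes:
    import org.apache.beam.sdk.io.range.ByteKey;
    import org.apache.beam.sdk.io.range.ByteKeyRange;

    // A range covering keys from 0x10 (inclusive) up to 0x30 (exclusive).
    ByteKeyRange range = ByteKeyRange.of(ByteKey.of(0x10), ByteKey.of(0x30));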
- ByteKeyRangeTracker - Class in org.apache.beam.sdk.transforms.splittabledofn
- bytes() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The TypeDescriptor for Byte.
- Bytes() - Constructor for class org.apache.beam.sdk.jmh.util.VarIntBenchmark.Bytes
- BYTES - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- BYTES - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of bytes fields.
- BytesBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.BytesBundle
- bytesRead() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Counter of bytes read by a source.
- bytesReadBySplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Counter of bytes read by a source split.
- BytesThroughputEstimator<T> - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
-
An estimator that provides an estimate of the byte throughput of the output elements.
- BytesThroughputEstimator<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
-
An estimator that provides an estimate of the throughput of the output elements.
- BytesThroughputEstimator(int, SizeEstimator<T>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
- BytesThroughputEstimator(SizeEstimator<T>, int, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
- BytesThroughputEstimator(SizeEstimator<T>, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
- bytesToRowFn(SchemaProvider, TypeDescriptor<T>, Coder<? extends T>) - Static method in class org.apache.beam.sdk.schemas.RowMessages
- bytesToRowFn(SchemaProvider, TypeDescriptor<T>, ProcessFunction<byte[], ? extends T>) - Static method in class org.apache.beam.sdk.schemas.RowMessages
- byteString(byte[]) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- byteString(byte[]) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
- ByteStringCoder - Class in org.apache.beam.runners.fnexecution.wire
-
A duplicate of ByteStringCoder that uses the Apache Beam vendored protobuf.
- ByteStringCoder - Class in org.apache.beam.sdk.extensions.protobuf
-
A Coder for ByteString objects based on their encoded Protocol Buffer form.
- ByteStringOutput() - Constructor for class org.apache.beam.sdk.jmh.util.VarIntBenchmark.ByteStringOutput
- ByteStringOutputStreamBenchmark - Class in org.apache.beam.sdk.jmh.util
-
Benchmarks for ByteStringOutputStream.
- ByteStringOutputStreamBenchmark() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- ByteStringOutputStreamBenchmark.NewVsCopy - Class in org.apache.beam.sdk.jmh.util
-
The benchmarks below detail the cost of creating a new buffer versus copying a subset of the existing one and re-using the larger one.
- ByteStringOutputStreamBenchmark.NewVsCopy.ArrayCopyState - Class in org.apache.beam.sdk.jmh.util
- ByteStringOutputStreamBenchmark.NewVsCopy.ArrayNewState - Class in org.apache.beam.sdk.jmh.util
- ByteStringOutputStreamBenchmark.ProtobufByteStringOutputStream - Class in org.apache.beam.sdk.jmh.util
- ByteStringOutputStreamBenchmark.SdkCoreByteStringOutputStream - Class in org.apache.beam.sdk.jmh.util
- ByteStringRangeHelper - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
-
Helper functions to evaluate the completeness of a collection of ByteStringRanges.
- ByteStringRangeHelper() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
- byteStringUtf8(String) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- byteStringUtf8(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
- bytesWritten() - Static method in class org.apache.beam.sdk.metrics.SinkMetrics
-
Counter of bytes written to a sink.
- ByteToElemFunction<V> - Class in org.apache.beam.runners.twister2.translators.functions
-
ByteToWindow function.
- ByteToElemFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToElemFunction
- ByteToElemFunction(WindowedValues.WindowedValueCoder<V>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToElemFunction
- ByteToWindowFunction<K, V> - Class in org.apache.beam.runners.twister2.translators.functions
-
ByteToWindow function.
- ByteToWindowFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunction
- ByteToWindowFunction(Coder<K>, WindowedValues.WindowedValueCoder<V>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunction
- ByteToWindowFunctionPrimitive<K, V> - Class in org.apache.beam.runners.twister2.translators.functions
-
ByteToWindow function.
- ByteToWindowFunctionPrimitive() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunctionPrimitive
- ByteToWindowFunctionPrimitive(Coder<K>, WindowedValues.WindowedValueCoder<V>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunctionPrimitive
- BZIP2 - Enum constant in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- BZIP2 - Enum constant in enum class org.apache.beam.sdk.io.Compression
-
BZip compression.
- BZIP2 - Enum constant in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- BZIP2 - Enum constant in enum class org.apache.beam.sdk.io.TextIO.CompressionType
-
Deprecated.
- BZIP2 - Enum constant in enum class org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
-
Deprecated.
C
- cache(String, Coder<?>) - Method in class org.apache.beam.runners.spark.translation.BoundedDataset
- cache(String, Coder<?>) - Method in interface org.apache.beam.runners.spark.translation.Dataset
- cache(String, Coder<?>) - Method in class org.apache.beam.runners.spark.translation.streaming.UnboundedDataset
- Cache - Class in org.apache.beam.io.requestresponse
-
Transforms for reading and writing request/response associations to a cache.
- Cache() - Constructor for class org.apache.beam.io.requestresponse.Cache
- Cache.Pair<RequestT, ResponseT> - Class in org.apache.beam.io.requestresponse
-
A simple POJO that holds both cache read and write PTransforms.
- CACHED_CREATORS - Static variable in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- CACHED_CREATORS - Static variable in class org.apache.beam.sdk.schemas.utils.POJOUtils
- CachedSideInputReader - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
-
SideInputReader that caches results for costly Materializations.
- CachedSideInputReader - Class in org.apache.beam.runners.spark.util
-
SideInputReader that caches materialized views.
- CacheFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.cache
- CacheFactory(DaoFactory, Duration) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.cache.CacheFactory
- CachingFactory<CreatedT> - Class in org.apache.beam.sdk.schemas
-
A wrapper around a Factory that assumes the schema parameter never changes.
- CachingFactory(Factory<CreatedT>) - Constructor for class org.apache.beam.sdk.schemas.CachingFactory
- CalciteConnectionWrapper - Class in org.apache.beam.sdk.extensions.sql.impl
-
Abstract wrapper for CalciteConnection to simplify extension.
- CalciteConnectionWrapper(CalciteConnection) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- CalciteFactoryWrapper - Class in org.apache.beam.sdk.extensions.sql.impl
-
Wrapper for CalciteFactory.
- CalciteQueryPlanner - Class in org.apache.beam.sdk.extensions.sql.impl
-
The core component for handling a SQL statement, from explaining the execution plan to generating a Beam pipeline.
- CalciteQueryPlanner(JdbcConnection, Collection<RuleSet>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
-
Called by BeamSqlEnv.instantiatePlanner() reflectively.
- CalciteQueryPlanner.NonCumulativeCostImpl - Class in org.apache.beam.sdk.extensions.sql.impl
- CalciteUtils - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
Utility methods for Calcite-related operations.
- CalciteUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- CalciteUtils.TimeWithLocalTzType - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
A LogicalType corresponding to TIME_WITH_LOCAL_TIME_ZONE.
- CalcRelSplitter - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
CalcRelSplitter operates on a
Calc
with multipleRexCall
sub-expressions that cannot all be implemented by a single concreteRelNode
. - CalcRelSplitter(Calc, RelBuilder, CalcRelSplitter.RelType[]) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
-
Constructs a CalcRelSplitter.
- CalcRelSplitter.RelType - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
Type of relational expression.
- calculateRanges(PartitionT, PartitionT, Long) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcReadWithPartitionsHelper
-
Calculates the range of each partition from the lower and upper bound and the number of partitions.
- CalendarWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A collection of WindowFns that window values into calendar-based windows such as spans of days, months, or years.
- CalendarWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.CalendarWindows
- CalendarWindows.DaysWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A WindowFn that windows elements into periods measured by days.
- CalendarWindows.MonthsWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A WindowFn that windows elements into periods measured by months.
- CalendarWindows.YearsWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A WindowFn that windows elements into periods measured by years.
- call() - Method in class org.apache.beam.runners.spark.translation.streaming.SparkRunnerStreamingContextFactory
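A minimal sketch applying the calendar windows indexed above; 'events' is a hypothetical PCollection:
    import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;

    // Assign each element to the one-day calendar window containing its timestamp.
    events.apply(Window.into(CalendarWindows.days(1)));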
- call(Iterator<WindowedValue<InputT>>) - Method in class org.apache.beam.runners.spark.translation.MultiDoFnFunction
- call(K, Iterator<WindowedValue<KV<K, InputT>>>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.GroupAlsoByWindowViaOutputBufferFn
- call(WindowedValue<KV<K, Iterable<InputT>>>) - Method in class org.apache.beam.runners.spark.translation.TranslationUtils.CombineGroupedValues
- call(WindowedValue<KV<K, V>>) - Method in class org.apache.beam.runners.spark.translation.ReifyTimestampsAndWindowsFunction
- call(WindowedValue<T>) - Method in class org.apache.beam.runners.spark.translation.SparkAssignWindowFn
- call(RequestT) - Method in interface org.apache.beam.io.requestresponse.Caller
- call(Tuple2<ByteArray, byte[]>) - Method in class org.apache.beam.runners.spark.coders.CoderHelpers.FromByteFunction
- call(Tuple2<TupleTag<V>, WindowedValue<?>>) - Method in class org.apache.beam.runners.spark.translation.TranslationUtils.TupleTagFilter
- Caller<RequestT, ResponseT> - Interface in org.apache.beam.io.requestresponse
-
Caller is the interface for user custom code intended for API calls.
- CallShouldBackoff<ResponseT> - Interface in org.apache.beam.io.requestresponse
-
Informs whether a call to an API should back off.
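A hedged sketch implementing the Caller interface indexed above; the httpGet helper is hypothetical, and the throws clause assumes call declares UserCodeExecutionException:
    import org.apache.beam.io.requestresponse.Caller;
    import org.apache.beam.io.requestresponse.UserCodeExecutionException;

    // Wraps a hypothetical HTTP helper so request/response transforms can invoke it per element.
    class MyApiCaller implements Caller<String, String> {
      @Override
      public String call(String request) throws UserCodeExecutionException {
        return httpGet(request); // hypothetical helper performing the actual API call
      }
    }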
- cancel() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
- cancel() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
- cancel() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
- cancel() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
- cancel() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
- cancel() - Method in class org.apache.beam.runners.flink.translation.functions.ImpulseSourceFunction
- cancel() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.StreamingImpulseSource
-
Deprecated.
- cancel() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.TestStreamSource
- cancel() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
- cancel() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
- cancel() - Method in class org.apache.beam.runners.jet.JetPipelineResult
- cancel() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Cancel the job.
- cancel() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
- cancel() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult
- cancel() - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
- cancel() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.BigQueryServerStream
-
Cancels the stream, releasing any client- and server-side resources.
- cancel() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
- cancel() - Method in interface org.apache.beam.sdk.PipelineResult
-
Cancels the pipeline execution.
- cancel(Exception) - Method in class org.apache.beam.sdk.fn.CancellableQueue
-
Causes any pending and future CancellableQueue.put(T) and CancellableQueue.take() invocations to throw an exception.
- cancel(JobApi.CancelJobRequest, StreamObserver<JobApi.CancelJobResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- CancellableQueue<T> - Class in org.apache.beam.sdk.fn
-
A simplified ThreadSafe blocking queue that can be cancelled, freeing any blocked Threads and preventing future Threads from blocking.
- CancellableQueue(int) - Constructor for class org.apache.beam.sdk.fn.CancellableQueue
-
Creates a
ThreadSafe
blocking queue with a maximum capacity. - cancelled() - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
-
Report that the pipeline has been cancelled.
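Returning to the CancellableQueue entries above, a hedged usage sketch; the method shapes (the int-capacity constructor, put, take, cancel(Exception)) come from this index, and the exception handling shown is an assumption:

    import org.apache.beam.sdk.fn.CancellableQueue;

    class QueueSketch {
      public static void main(String[] args) throws Exception {
        // Bounded queue of capacity 10, as per the CancellableQueue(int) constructor.
        CancellableQueue<String> queue = new CancellableQueue<>(10);
        queue.put("element");        // blocks if the queue is full
        String value = queue.take(); // blocks if the queue is empty
        System.out.println(value);
        // Unblocks any pending put/take and makes future calls throw this exception.
        queue.cancel(new RuntimeException("shutting down"));
      }
    }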
- CANCELLED - Enum constant in enum class org.apache.beam.sdk.PipelineResult.State
-
The job has been explicitly cancelled.
- canConvert(ResolvedNodes.ResolvedQueryStmt) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter
-
Whether this rule can handle the conversion of the specific node.
- canConvertConvention(Convention) - Method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
- canEqual(Object) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- canEqual(Object) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- canImplement(LogicalCalc, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
-
Returns whether a relational expression can be implemented solely in a given CalcRelSplitter.RelType.
- canImplement(RexCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
- canImplement(RexDynamicParam) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
- canImplement(RexFieldAccess) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
- canImplement(RexLiteral) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
- canImplement(RexNode, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
-
Returns whether this RelType can implement a given expression.
- canImplement(RexProgram) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
-
Returns whether this tester's RelType can implement a given program.
- CannotProvideCoderException - Exception Class in org.apache.beam.sdk.coders
-
The exception thrown when a CoderRegistry or CoderProvider cannot provide a Coder that has been requested.
- CannotProvideCoderException(String) - Constructor for exception class org.apache.beam.sdk.coders.CannotProvideCoderException
- CannotProvideCoderException(String, Throwable) - Constructor for exception class org.apache.beam.sdk.coders.CannotProvideCoderException
- CannotProvideCoderException(String, Throwable, CannotProvideCoderException.ReasonCode) - Constructor for exception class org.apache.beam.sdk.coders.CannotProvideCoderException
- CannotProvideCoderException(String, CannotProvideCoderException.ReasonCode) - Constructor for exception class org.apache.beam.sdk.coders.CannotProvideCoderException
- CannotProvideCoderException(Throwable) - Constructor for exception class org.apache.beam.sdk.coders.CannotProvideCoderException
- CannotProvideCoderException(Throwable, CannotProvideCoderException.ReasonCode) - Constructor for exception class org.apache.beam.sdk.coders.CannotProvideCoderException
- CannotProvideCoderException.ReasonCode - Enum Class in org.apache.beam.sdk.coders
-
Indicates the reason that Coder inference failed.
- canStopPolling(Instant, StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
Called by the Watch transform to determine whether the given termination state signals that Watch should stop calling Watch.Growth.PollFn for the current input, regardless of whether the last Watch.Growth.PollResult was complete or incomplete.
- canTranslate(TransformT) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator
-
Checks if a composite / primitive transform can be translated.
- CassandraIO - Class in org.apache.beam.sdk.io.cassandra
-
An IO to read from and write to Apache Cassandra.
- CassandraIO.MutationType - Enum Class in org.apache.beam.sdk.io.cassandra
-
Specify the mutation type: either write or delete.
- CassandraIO.Read<T> - Class in org.apache.beam.sdk.io.cassandra
-
A PTransform to read data from Apache Cassandra.
- CassandraIO.ReadAll<T> - Class in org.apache.beam.sdk.io.cassandra
-
A PTransform to read data from Apache Cassandra.
- CassandraIO.Write<T> - Class in org.apache.beam.sdk.io.cassandra
-
A PTransform to write mutations to Apache Cassandra.
- Cassandra Socket Options - Search tag in class org.apache.beam.sdk.io.cassandra.CassandraIO
- Section
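A hedged sketch of the CassandraIO read path listed above; the fluent builder methods (withHosts, withPort, withKeyspace, withTable, withEntity, withCoder) are recalled from the connector's API rather than confirmed by this index, and Person is a hypothetical mapped entity:

    import java.io.Serializable;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.SerializableCoder;
    import org.apache.beam.sdk.io.cassandra.CassandraIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    class CassandraReadSketch {
      // Hypothetical entity mapped to the "person" table.
      static class Person implements Serializable {}

      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        PCollection<Person> people =
            pipeline.apply(
                CassandraIO.<Person>read()
                    .withHosts(Arrays.asList("localhost"))
                    .withPort(9042)
                    .withKeyspace("ks")
                    .withTable("person")
                    .withEntity(Person.class)
                    .withCoder(SerializableCoder.of(Person.class)));
        pipeline.run().waitUntilFinish();
      }
    }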
- Cast<T> - Class in org.apache.beam.sdk.schemas.transforms
-
Set of utilities for casting rows between schemas.
- Cast() - Constructor for class org.apache.beam.sdk.schemas.transforms.Cast
- CAST_OP - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- Cast.CompatibilityError - Class in org.apache.beam.sdk.schemas.transforms
-
Describes compatibility errors during casting.
- Cast.Narrowing - Class in org.apache.beam.sdk.schemas.transforms
-
Narrowing changes the type without a guarantee of preserving data.
- Cast.Validator - Interface in org.apache.beam.sdk.schemas.transforms
-
Interface for statically validating casts.
- Cast.Widening - Class in org.apache.beam.sdk.schemas.transforms
-
Widening changes to a type that can represent any possible value of the original type.
- CastFunctionImpl - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
-
ZetaSQLCastFunctionImpl.
- CastFunctionImpl() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.CastFunctionImpl
- castNumber(Number, Schema.TypeName, Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
- castRow(Row, Schema, Schema) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
- castValue(Object, Schema.FieldType, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
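A small sketch exercising the static Cast helpers listed above (the castRow(Row, Schema, Schema) signature is as shown in this index); the field names are illustrative:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.transforms.Cast;
    import org.apache.beam.sdk.values.Row;

    class CastSketch {
      public static void main(String[] args) {
        // Widen an INT32 field to INT64 via Cast.castRow.
        Schema input = Schema.builder().addInt32Field("id").build();
        Schema output = Schema.builder().addInt64Field("id").build();
        Row row = Row.withSchema(input).addValue(42).build();
        Row widened = Cast.castRow(row, input, output);
        System.out.println(widened.getInt64("id")); // 42
      }
    }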
- catalog() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- Catalog - Interface in org.apache.beam.sdk.extensions.sql.meta.catalog
-
Represents a named and configurable container for managing tables.
- catalogManager(CatalogManager) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
- CatalogManager - Interface in org.apache.beam.sdk.extensions.sql.meta.catalog
-
Top-level authority that manages Catalogs.
- catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
- catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
- catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
- CatalogRegistrar - Interface in org.apache.beam.sdk.extensions.sql.meta.catalog
-
Over-arching registrar to capture available Catalogs.
- catchUpToNow - Variable in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
- catchUpToNow(boolean) - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
-
For internal use only; no backwards-compatibility guarantees.
- CdapIO - Class in org.apache.beam.sdk.io.cdap
-
A CdapIO is a Transform for reading data from a source or writing data to a sink of a CDAP plugin.
- CdapIO() - Constructor for class org.apache.beam.sdk.io.cdap.CdapIO
- CdapIO.Read<K, V> - Class in org.apache.beam.sdk.io.cdap
-
A PTransform to read from a CDAP source.
- CdapIO.Write<K, V> - Class in org.apache.beam.sdk.io.cdap
-
A PTransform to write to a CDAP sink.
- cdapPluginObj - Variable in class org.apache.beam.sdk.io.cdap.Plugin
- CELL_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
- CEPCall - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
A CEPCall instance represents an operation (node) that contains an operator and a list of operands.
- CEPFieldRef - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
A CEPFieldRef instance represents a node that points to a specified field in a Row.
- CEPKind - Enum Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
CEPKind corresponds to Calcite's SqlKind.
- CEPLiteral - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
CEPLiteral represents a literal node.
- CEPMeasure - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
The CEPMeasure class represents the Measures clause and contains information about output columns.
- CEPMeasure(Schema, String, CEPOperation) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
- CEPOperation - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
CEPOperation is the base class for the evaluation operations defined in the DEFINE syntax of MATCH_RECOGNIZE.
- CEPOperation() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperation
- CEPOperator - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
The CEPOperator records the operators (i.e.
-
Core pattern class that stores the definition of a single pattern.
- CEPUtils - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
Some utility methods for transforming Calcite's constructs into our own Beam constructs (for serialization purpose).
- CEPUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
- CF_CONTINUATION_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_INITIAL_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_LOCK - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_MISSING_PARTITIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_PARENT_LOW_WATERMARKS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_PARENT_PARTITIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_SHOULD_DELETE - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_VERSION - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CF_WATERMARK - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- CHANGE_SQN_COLUMN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
- CHANGE_STREAM_MUTATION_GC_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Counter for the total number of ChangeStreamMutations that are initiated by garbage collection (not user initiated) identified during the execution of the Connector.
- CHANGE_STREAM_MUTATION_USER_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Counter for the total number of ChangeStreamMutations that are initiated by users (not garbage collection) identified during the execution of the Connector.
- CHANGE_TYPE_COLUMN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
- changeStreamAction(ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class for processing individual ChangeStreamMutation in ReadChangeStreamPartitionDoFn.
- ChangeStreamAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
-
This class is responsible for processing individual ChangeStreamRecord.
- ChangeStreamAction(ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ChangeStreamAction
-
Constructs ChangeStreamAction to process individual ChangeStreamRecord.
- ChangeStreamContinuationTokenHelper - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
- ChangeStreamContinuationTokenHelper() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamContinuationTokenHelper
- ChangeStreamDao - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
-
Data access object to list and read stream partitions of a table.
- ChangeStreamDao - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Responsible for making change stream queries for a given partition.
- ChangeStreamDao(BigtableDataClient, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.ChangeStreamDao
- ChangeStreamMetrics - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
-
Class to aggregate metrics related functionality.
- ChangeStreamMetrics - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
-
Class to aggregate metrics related functionality.
- ChangeStreamMetrics() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
- ChangeStreamMetrics() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Constructs a ChangeStreamMetrics instance with the following metrics enabled by default.
- ChangeStreamMetrics(Set<MetricName>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Constructs a ChangeStreamMetrics instance with the given metrics enabled.
- changeStreamQuery(String, Timestamp, Timestamp, long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamDao
-
Performs a change stream query.
- ChangeStreamRecord - Interface in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents a Spanner Change Stream Record.
- changeStreamRecordMapper() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.MapperFactory
-
Creates and returns a singleton instance of a mapper class capable of transforming a Struct into a List of ChangeStreamRecord subclasses.
- ChangeStreamRecordMapper - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
- ChangeStreamRecordMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Holds internal execution metrics / metadata for the processed ChangeStreamRecord.
- ChangeStreamRecordMetadata.Builder - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
- ChangeStreamResultSet - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Decorator class over a ResultSet that provides telemetry for the streamed records.
- ChangeStreamResultSetMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Represents telemetry metadata gathered during the consumption of a change stream query.
- ChangeStreamsConstants - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
-
Single place for defining the constants used in the Spanner.readChangeStreams() connector.
- ChangeStreamsConstants() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
- channelNames - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- CHAR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- CHAR_LENGTH - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- CHAR_LENGTH_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
- characters() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The TypeDescriptor for Character.
- charLength(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- check(RelNode) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall.JoinChecker
- checkClientTrusted(X509Certificate[], String) - Method in class org.apache.beam.sdk.io.splunk.CustomX509TrustManager
- checkConfiguration(ClientConfiguration, AwsOptions) - Method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
Check if all necessary configuration is available to create clients.
- checkConfiguration(ClientConfiguration, AwsOptions) - Method in class org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory.DefaultClientBuilder
- checkDone() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
-
This is to signal to the runner that this restriction has completed.
- checkDone() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
-
Checks if the restriction has been processed successfully.
- checkDone() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
- checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
- checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
- checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
-
Checks whether the restriction has been fully processed.
- checkExceptionAndMaybeThrow() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- checkForAsyncFailure() - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
Checks whether any failure happened asynchronously.
- checkIdleTimeoutAndMaybeStartCountdown() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- checkIfAnySubscriptionExists(String, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
- checkpoint(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
-
Should be called when a checkpoint is created.
- Checkpoint - Class in org.apache.beam.runners.spark.translation.streaming
-
Checkpoint data to make it available in future pipeline runs.
- Checkpoint() - Constructor for class org.apache.beam.runners.spark.translation.streaming.Checkpoint
- Checkpoint.CheckpointDir - Class in org.apache.beam.runners.spark.translation.streaming
-
Checkpoint dir tree.
- checkpointCompleted(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
-
Should be called when a checkpoint is completed.
- CheckpointDir(String) - Constructor for class org.apache.beam.runners.spark.translation.streaming.Checkpoint.CheckpointDir
- checkpointIfNeeded(DStream<?>, SerializablePipelineOptions) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Checkpoints the given DStream if checkpointing is enabled in the pipeline options.
- CheckpointMarkImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- CheckpointStats - Class in org.apache.beam.runners.flink.translation.utils
-
Helpers for reporting checkpoint durations.
- CheckpointStats(Supplier<DistributionCell>) - Constructor for class org.apache.beam.runners.flink.translation.utils.CheckpointStats
- checkServerTrusted(X509Certificate[], String) - Method in class org.apache.beam.sdk.io.splunk.CustomX509TrustManager
- CheckStopReadingFn - Interface in org.apache.beam.sdk.io.kafka
- CheckStopReadingFnWrapper - Class in org.apache.beam.sdk.io.kafka
- checksum() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
-
An optional checksum to identify the contents of a file.
- ChildPartition - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
A child partition represents a new partition that should be queried.
- ChildPartition(String, String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
Constructs a child partition, which will have its own token and the parent that it originated from.
- ChildPartition(String, HashSet<String>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
Constructs a child partition, which will have its own token and the parents that it originated from.
- ChildPartitionsRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents a ChildPartitionsRecord.
- ChildPartitionsRecord(Timestamp, String, List<ChildPartition>, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
Constructs a child partitions record containing one or more child partitions.
- childPartitionsRecordAction(PartitionMetadataDao, ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class capable of processing ChildPartitionsRecords.
- ChildPartitionsRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is part of the process for the ReadChangeStreamPartitionDoFn SDF.
- Choosing an End Point (ICEBERG_CDC only) - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- Choosing a Starting Point (ICEBERG_CDC only) - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- CivilTimeEncoder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Encoder for TIME and DATETIME values, according to civil_time encoding.
- classesToTranslators() - Method in interface org.apache.beam.runners.dataflow.util.CoderCloudObjectTranslatorRegistrar
- classesToTranslators() - Method in class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
- ClassLoaderFileSystem - Class in org.apache.beam.sdk.io
-
A read-only FileSystem implementation looking up resources using a ClassLoader.
- ClassLoaderFileSystem.ClassLoaderFileSystemRegistrar - Class in org.apache.beam.sdk.io
-
AutoService registrar for the ClassLoaderFileSystem.
- ClassLoaderFileSystem.ClassLoaderResourceId - Class in org.apache.beam.sdk.io
- ClassLoaderFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderFileSystemRegistrar
- classNamesToTranslators() - Method in interface org.apache.beam.runners.dataflow.util.CoderCloudObjectTranslatorRegistrar
-
Gets a map from the name returned by CloudObject.getClassName() to a translator that can convert into the equivalent Coder.
- classNamesToTranslators() - Method in class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
- ClassWithSchema() - Constructor for class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
- CleanTmpFilesFromGcsFn(ValueProvider<String>, String) - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read.CleanTmpFilesFromGcsFn
-
Creates an object that will remove temp files from the stage.
- cleanup() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
- cleanUp() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- cleanUpPrefix() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
-
Delete all the metadata rows starting with the change stream name prefix, except for detect new partition row because it signals the existence of a pipeline with the change stream name.
- CleanUpReadChangeStreamDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
- CleanUpReadChangeStreamDoFn(DaoFactory) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.CleanUpReadChangeStreamDoFn
- clear() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.KeyedBufferingElementsHandler
- clear() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.NonKeyedBufferingElementsHandler
- clear() - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
- clear() - Static method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
- clear() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
- clear() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- clear() - Method in interface org.apache.beam.sdk.state.State
-
Clear out the state location.
- clear() - Method in interface org.apache.beam.sdk.state.Timer
-
Clears a timer.
- clear(K, W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
-
Clears the bag user state for the given key and window.
- clearCache() - Static method in class org.apache.beam.runners.spark.io.MicrobatchSource
- clearGlobalState() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals
-
Allows clearing all state for the global watermark when the maximum watermark arrives.
- clearOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- clearOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- clearProvidedSparkContext() - Static method in class org.apache.beam.runners.spark.translation.SparkContextFactory
- clearRange(Instant, Instant) - Method in interface org.apache.beam.sdk.state.OrderedListState
-
Clear a timestamp-limited subrange of the list.
- clearState(ReduceFn.Context) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
- clearWarnings() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- ClickHouseIO - Class in org.apache.beam.sdk.io.clickhouse
-
An IO to write to ClickHouse.
- ClickHouseIO() - Constructor for class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
- ClickHouseIO.Write<T> - Class in org.apache.beam.sdk.io.clickhouse
-
A PTransform to write to ClickHouse.
- ClickHouseWriter - Class in org.apache.beam.sdk.io.clickhouse
-
Writes Rows and field values using ClickHousePipedOutputStream.
- ClickHouseWriter() - Constructor for class org.apache.beam.sdk.io.clickhouse.ClickHouseWriter
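A short sketch of the ClickHouseIO.Write entry above; the write(jdbcUrl, table) factory is recalled from the connector's API rather than confirmed by this index, the JDBC URL and table name are placeholders, and rows is assumed to be a schema-aware PCollection<Row> matching the target table:

    import org.apache.beam.sdk.io.clickhouse.ClickHouseIO;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class ClickHouseWriteSketch {
      // rows: a schema'd PCollection<Row> produced earlier in the pipeline (assumed).
      static void writeRows(PCollection<Row> rows) {
        rows.apply(
            ClickHouseIO.<Row>write("jdbc:clickhouse://localhost:8123/default", "my_table"));
      }
    }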
- clientBuffered(ExecutorService) - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
-
Create a buffering OutboundObserverFactory for client-side RPCs with the specified ExecutorService and the default buffer size.
- clientBuffered(ExecutorService, int) - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
-
Create a buffering OutboundObserverFactory for client-side RPCs with the specified ExecutorService and buffer size.
- ClientBuilderFactory - Interface in org.apache.beam.sdk.io.aws2.common
-
Factory to build and configure any AwsClientBuilder using a specific ClientConfiguration or the globally provided settings in AwsOptions as fallback.
- ClientBuilderFactory.DefaultClientBuilder - Class in org.apache.beam.sdk.io.aws2.common
-
Default implementation of ClientBuilderFactory.
- ClientBuilderFactory.SkipCertificateVerificationTrustManagerProvider - Class in org.apache.beam.sdk.io.aws2.common
-
Trust provider to skip certificate verification.
- ClientConfiguration - Class in org.apache.beam.sdk.io.aws2.common
-
AWS client configuration.
- ClientConfiguration() - Constructor for class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
- ClientConfiguration.Builder - Class in org.apache.beam.sdk.io.aws2.common
- clientDirect() - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
-
Create the default OutboundObserverFactory for client-side RPCs, which uses basic unbuffered flow control.
- Client-side rate limiting - Search tag in class org.apache.beam.sdk.io.googleads.GoogleAdsV19
- Section
- Clock - Interface in org.apache.beam.runners.direct
-
Access to the current time.
- clone() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
- clone() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
- CLONE_ONCE - Enum constant in enum class org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior
-
Deprecated.
- CLONE_PER_BUNDLE - Enum constant in enum class org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior
-
Deprecated. Clone the DoFn and call DoFn.Setup every time a bundle starts; call DoFn.Teardown every time a bundle finishes.
- clonesOf(OutT) - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
- close() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil
- close() - Method in class org.apache.beam.runners.flink.metrics.FileReporter
- close() - Method in class org.apache.beam.runners.flink.translation.functions.FlinkDoFnFunction
- close() - Method in class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction
- close() - Method in class org.apache.beam.runners.flink.translation.functions.FlinkStatefulDoFnFunction
- close() - Method in class org.apache.beam.runners.flink.translation.utils.Locker
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplitEnumerator
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.LazyFlinkSourceSplitEnumerator
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
- close() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SplittableDoFnOperator
- close() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- close() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
- close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext
- close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
- close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.WrappedSdkHarnessClient
- close() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
- close() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
- close() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
-
Closes this bundle.
- close() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
-
Blocks until bundle processing is finished.
- close() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
- close() - Method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
-
Deprecated.
- close() - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
- close() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
- close() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
-
.
- close() - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironment
- close() - Method in class org.apache.beam.runners.fnexecution.logging.GrpcLoggingService
- close() - Method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
- close() - Method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
- close() - Method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
- close() - Method in class org.apache.beam.runners.jet.processors.ParDoP
- close() - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- close() - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
- close() - Method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
- close() - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- close() - Method in class org.apache.beam.runners.portability.CloseableResource
-
Closes the underlying resource.
- close() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
- close() - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
- close() - Method in class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
- close() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServer
- close() - Method in class org.apache.beam.sdk.expansion.service.ExpansionService
- close() - Method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion.RecordBatchRowIterator
- close() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingReadableByteChannel
- close() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- close() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingWritableByteChannel
- close() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- close() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- close() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
- close() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
- close() - Method in interface org.apache.beam.sdk.fn.data.CloseableFnDataReceiver
-
.
- close() - Method in interface org.apache.beam.sdk.fn.server.FnService
-
.
- close() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
- close() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
- close() - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider
- close() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Closes the channel and returns the bundle result.
- close() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
Closes any ReadableByteChannel created for the current reader.
- close() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
-
Close the client object.
- close() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
- close() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- close() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
- close() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Gracefully close the underlying netty channel.
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.BlockingCommitterImpl
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactoryImpl
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- close() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Closes the current change stream ResultSet.
- close() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- close() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- close() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
- close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
- close() - Method in class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- close() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageProducer
-
Closes the message producer.
- close() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageReceiver
-
Closes the message receiver.
- close() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
Gracefully closes the connection to the service.
- close() - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageProducer
- close() - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageReceiver
- close() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Closes the reader.
- close() - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.ThriftWriter
- close() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- close() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.DefaultErrorHandler
- close() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler
- close(T) - Method in interface org.apache.beam.runners.portability.CloseableResource.Closer
- CloseableFnDataReceiver<T> - Interface in org.apache.beam.sdk.fn.data
-
A receiver of streamed data that can be closed.
- CloseableResource<T> - Class in org.apache.beam.runners.portability
-
An AutoCloseable that wraps a resource that needs to be cleaned up but does not implement AutoCloseable itself.
- CloseableResource.CloseException - Exception Class in org.apache.beam.runners.portability
-
An exception that wraps errors thrown while a resource is being closed.
- CloseableResource.Closer<T> - Interface in org.apache.beam.runners.portability
-
A function that knows how to clean up after a resource.
- CloseableThrowingConsumer<ExceptionT, T> - Interface in org.apache.beam.sdk.function
-
A ThrowingConsumer that can be closed.
- CLOSESTREAM_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Counter for the total number of CloseStreams identified during the execution of the Connector.
- closeTo(double, double) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.closeTo(double, double).
- CloudObject - Class in org.apache.beam.runners.dataflow.util
-
A representation of an arbitrary Java object to be instantiated by Dataflow workers.
- cloudObjectClassName() - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
-
Gets the class name that will represent the CloudObject created by this CloudObjectTranslator.
- cloudObjectClassName() - Method in class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
- cloudObjectClassName() - Method in class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
- CloudObjects - Class in org.apache.beam.runners.dataflow.util
-
Utilities for converting an object to a CloudObject.
- CloudObjectTranslator<T> - Interface in org.apache.beam.runners.dataflow.util
-
A translator that takes an object and creates a CloudObject which can be converted back to the original object.
- CloudPubsubTransforms - Class in org.apache.beam.sdk.io.gcp.pubsublite
-
A class providing transforms between Cloud Pub/Sub and Pub/Sub Lite message types.
- CloudResourceManagerOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
-
Properties needed when using Google CloudResourceManager with the Apache Beam SDK.
- CloudVision - Class in org.apache.beam.sdk.extensions.ml
-
Factory class for implementations of AnnotateImages.
- CloudVision() - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision
- CloudVision.AnnotateImagesFromBytes - Class in org.apache.beam.sdk.extensions.ml
-
Accepts ByteString (encoded image contents) with optional DoFn.SideInput with a Map of ImageContext to the image.
- CloudVision.AnnotateImagesFromBytesWithContext - Class in org.apache.beam.sdk.extensions.ml
- CloudVision.AnnotateImagesFromGcsUri - Class in org.apache.beam.sdk.extensions.ml
-
Accepts String (image URI on GCS) with optional DoFn.SideInput with a Map of ImageContext to the image.
- CloudVision.AnnotateImagesFromGcsUriWithContext - Class in org.apache.beam.sdk.extensions.ml
- CO_GBK_RESULT_SCHEMA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- CodahaleCsvSink - Class in org.apache.beam.runners.spark.structuredstreaming.metrics.sink
-
A Sink for Spark's metric system reporting metrics (including Beam step metrics) to a CSV file.
- CodahaleCsvSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleCsvSink
-
Constructor for Spark 3.2.x and later.
- CodahaleCsvSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleCsvSink
-
Constructor for Spark 3.1.x and earlier.
- CodahaleGraphiteSink - Class in org.apache.beam.runners.spark.structuredstreaming.metrics.sink
-
A Sink for Spark's metric system reporting metrics (including Beam step metrics) to Graphite.
- CodahaleGraphiteSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleGraphiteSink
-
Constructor for Spark 3.2.x and later.
- CodahaleGraphiteSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleGraphiteSink
-
Constructor for Spark 3.1.x and earlier.
- coder - Variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- coder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
- Coder<T> - Class in org.apache.beam.sdk.coders
-
A Coder<T> defines how to encode and decode values of type T into byte streams.
- Coder() - Constructor for class org.apache.beam.sdk.coders.Coder
- Coder() - Constructor for class org.apache.beam.sdk.io.range.OffsetRange.Coder
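A minimal sketch of a custom Coder as characterized in the Coder<T> entry above; Point and PointCoder are hypothetical illustration types, and the sketch delegates to the SDK's VarIntCoder:

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import org.apache.beam.sdk.coders.AtomicCoder;
    import org.apache.beam.sdk.coders.VarIntCoder;

    // Hypothetical value type used only for illustration.
    class Point {
      final int x, y;
      Point(int x, int y) { this.x = x; this.y = y; }
    }

    // Encodes each component with an existing coder.
    class PointCoder extends AtomicCoder<Point> {
      private static final VarIntCoder INT = VarIntCoder.of();

      @Override
      public void encode(Point value, OutputStream out) throws IOException {
        INT.encode(value.x, out);
        INT.encode(value.y, out);
      }

      @Override
      public Point decode(InputStream in) throws IOException {
        return new Point(INT.decode(in), INT.decode(in));
      }
    }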
- Coder.Context - Class in org.apache.beam.sdk.coders
-
Deprecated. To implement a coder, do not use any Coder.Context. Just implement only those abstract methods which do not accept a Coder.Context and leave the default implementations for methods accepting a Coder.Context.
- Coder.NonDeterministicException - Exception Class in org.apache.beam.sdk.coders
-
Exception thrown by Coder.verifyDeterministic() if the encoding is not deterministic, including details of why the encoding is not deterministic.
- CoderCloudObjectTranslatorRegistrar - Interface in org.apache.beam.runners.dataflow.util
-
Coder authors have the ability to automatically have their Coder registered with the Dataflow Runner by creating a ServiceLoader entry and a concrete implementation of this interface.
- coderConsistentWithEquals(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T> and values of type T, the values are equal if and only if the encoded bytes are equal.
- coderConsistentWithEqualsInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>, Coder.Context, and values of type T, the values are equal if and only if the encoded bytes are equal, in any Coder.Context.
- coderDecodeEncodeContentsEqual(Coder<CollectionT>, CollectionT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<Collection<T>>, and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in any Coder.Context.
- coderDecodeEncodeContentsEqualInContext(Coder<CollectionT>, Coder.Context, CollectionT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<Collection<T>>, and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in the given Coder.Context.
- coderDecodeEncodeContentsInSameOrder(Coder<IterableT>, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<Collection<T>>, and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in any Coder.Context.
- coderDecodeEncodeContentsInSameOrderInContext(Coder<IterableT>, Coder.Context, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<Iterable<T>>, and value of type Iterable<T>, encoding followed by decoding yields an equal value of type Collection<T>, in the given Coder.Context.
- coderDecodeEncodeEqual(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>, and value of type T, encoding followed by decoding yields an equal value of type T, in any Coder.Context.
- coderDecodeEncodeEqualInContext(Coder<T>, Coder.Context, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>, Coder.Context, and value of type T, encoding followed by decoding yields an equal value of type T.
- coderDecodeEncodeInContext(Coder<T>, Coder.Context, T, Matcher<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>, Coder.Context, and value of type T, encoding followed by decoding yields a value of type T and tests that the matcher succeeds on the values.
- coderDecodesBase64(Coder<T>, List<String>, List<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
- coderDecodesBase64ContentsEqual(Coder<IterableT>, String, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
- coderDecodesBase64ContentsEqual(Coder<IterableT>, List<String>, List<IterableT>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
- coderDeterministic(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>, and values of type T, if the values are equal then the encoded bytes are equal, in any Coder.Context.
- coderDeterministicInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>, Coder.Context, and values of type T, if the values are equal then the encoded bytes are equal.
- coderEncodesBase64(Coder<T>, T, String) - Static method in class org.apache.beam.sdk.testing.CoderProperties
- CoderException - Exception Class in org.apache.beam.sdk.coders
-
An
Exception
thrown if there is a problem encoding or decoding a value. - CoderException(String) - Constructor for exception class org.apache.beam.sdk.coders.CoderException
- CoderException(String, Throwable) - Constructor for exception class org.apache.beam.sdk.coders.CoderException
- CoderException(Throwable) - Constructor for exception class org.apache.beam.sdk.coders.CoderException
- coderFor(TypeDescriptor<T>, List<? extends Coder<?>>) - Method in class org.apache.beam.sdk.coders.CoderProvider
-
Returns a
Coder<T>
to use for values of a particular type, given the Coders for each of the type's generic parameter types. - coderFor(TypeDescriptor<T>, List<? extends Coder<?>>) - Method in class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar.DefaultCoderProvider
-
Returns the
Coder
returned according to theCoderProvider
from anyDefaultCoder
annotation on the given class. - coderForFieldType(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
- coderFromCloudObject(CloudObject) - Static method in class org.apache.beam.runners.dataflow.util.CloudObjects
- CoderHelpers - Class in org.apache.beam.runners.spark.coders
-
Serialization utility class.
- CoderHelpers - Class in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
-
Serialization utility class.
- CoderHelpers.FromByteFunction<K, V> - Class in org.apache.beam.runners.spark.coders
-
A function for converting a byte array pair to a key-value pair.
- CoderProperties - Class in org.apache.beam.sdk.testing
-
Properties for use in Coder tests.
- CoderProperties() - Constructor for class org.apache.beam.sdk.testing.CoderProperties
- CoderProperties.TestElementByteSizeObserver - Class in org.apache.beam.sdk.testing
-
An ElementByteSizeObserver that records the observed element sizes for testing purposes.
- CoderProvider - Class in org.apache.beam.sdk.coders
-
A CoderProvider provides Coders.
- CoderProvider() - Constructor for class org.apache.beam.sdk.coders.CoderProvider
- CoderProviderRegistrar - Interface in org.apache.beam.sdk.coders
-
Coder creators have the ability to automatically have their coders registered with this SDK by creating a ServiceLoader entry and a concrete implementation of this interface.
- CoderProviders - Class in org.apache.beam.sdk.coders
-
Static utility methods for creating and working with CoderProviders.
- CoderRegistry - Class in org.apache.beam.sdk.coders
- coderSerializable(Coder<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that the given Coder<T> can be correctly serialized and deserialized.
- CoderSizeEstimator<T> - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
-
This class is used to estimate the size in bytes of a given element.
- CoderSizeEstimator(Coder<T>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.CoderSizeEstimator
- CoderTypeInformation<T> - Class in org.apache.beam.runners.flink.translation.types
-
Flink TypeInformation for Beam Coders.
- CoderTypeInformation(Coder<T>, SerializablePipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- CoderTypeInformation(Coder<T>, PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- CoderTypeSerializer<T> - Class in org.apache.beam.runners.flink.translation.types
-
Flink TypeSerializer for Beam Coders.
- CoderTypeSerializer(Coder<T>, boolean) - Constructor for class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- CoderTypeSerializer(Coder<T>, SerializablePipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- CoGbkResult - Class in org.apache.beam.sdk.transforms.join
-
A row result of a CoGroupByKey.
- CoGbkResult(CoGbkResultSchema, Iterable<RawUnionValue>) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResult
-
A row in the PCollection resulting from a CoGroupByKey transform.
- CoGbkResult(CoGbkResultSchema, Iterable<RawUnionValue>, int, int) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResult
- CoGbkResult.CoGbkResultCoder - Class in org.apache.beam.sdk.transforms.join
-
A Coder for CoGbkResults.
- CoGbkResultSchema - Class in org.apache.beam.sdk.transforms.join
-
A schema for the results of a CoGroupByKey.
- CoGbkResultSchema(TupleTagList) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
Builds a schema from a tuple of TupleTag<?>s.
- CoGroup - Class in org.apache.beam.sdk.schemas.transforms
-
A transform that performs equijoins across multiple schema PCollections.
- CoGroup() - Constructor for class org.apache.beam.sdk.schemas.transforms.CoGroup
- CoGroup.By - Class in org.apache.beam.sdk.schemas.transforms
-
Defines the set of fields to extract for the join key, as well as other per-input join options.
- CoGroup.ExpandCrossProduct - Class in org.apache.beam.sdk.schemas.transforms
-
A PTransform that calculates the cross-product join.
- CoGroup.Impl - Class in org.apache.beam.sdk.schemas.transforms
-
The implementing PTransform.
- CoGroup.Result - Class in org.apache.beam.sdk.schemas.transforms
- CoGroupByKey<K> - Class in org.apache.beam.sdk.transforms.join
-
A PTransform that performs a CoGroupByKey on a tuple of tables.
- collect(String, Dataset<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext
-
The purpose of this utility is to mark the evaluation of Spark actions, both during Pipeline translation, when evaluation is required, and when finally evaluating the pipeline.
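A brief sketch of the CoGroupByKey pattern from the entries above; the emails and orders inputs are hypothetical keyed PCollections built earlier in a pipeline:

    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TupleTag;

    class CoGroupByKeySketch {
      static PCollection<KV<String, CoGbkResult>> join(
          PCollection<KV<String, String>> emails, PCollection<KV<String, Integer>> orders) {
        TupleTag<String> emailsTag = new TupleTag<>();
        TupleTag<Integer> ordersTag = new TupleTag<>();
        // Group both inputs by key; each result row exposes the grouped values per tag.
        return KeyedPCollectionTuple.of(emailsTag, emails)
            .and(ordersTag, orders)
            .apply(CoGroupByKey.create());
      }
    }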
- COLLECTION_TYPES - Static variable in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- CollectionCoder<T> - Class in org.apache.beam.sdk.coders
- CollectionCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.CollectionCoder
- collectionEncoder(Encoder<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
- collectionEncoder(Encoder<T>, boolean) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
- column(SqlParserPos, SqlIdentifier, SqlDataTypeSpec, SqlNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDdlNodes
-
Creates a column declaration.
- Column() - Constructor for class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
- Column() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.Column
- COLUMN_CREATED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp at which the partition row was first created.
- COLUMN_END_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp to end the change stream query of the partition.
- COLUMN_FAMILIES - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- COLUMN_FINISHED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp at which the partition was marked as finished by the ReadChangeStreamPartitionDoFn SDF.
- COLUMN_HEARTBEAT_MILLIS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the change stream query heartbeat interval in millis.
- COLUMN_PARENT_TOKENS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for parent partition tokens.
- COLUMN_PARTITION_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the partition token.
- COLUMN_RUNNING_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp at which the partition was marked as running by the ReadChangeStreamPartitionDoFn SDF.
- COLUMN_SCHEDULED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp at which the partition was scheduled by the DetectNewPartitionsDoFn SDF.
- COLUMN_START_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp to start the change stream query of the partition.
- COLUMN_STATE - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the state that the partition is currently in.
- COLUMN_WATERMARK - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the current watermark of the partition.
- columns() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema
- COLUMNS - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
- COLUMNS_MAPPING - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- columnType() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
- ColumnType - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Defines a column type from a Cloud Spanner table with the following information: column name, column type, flag indicating if column is primary key and column position in the table.
- ColumnType() - Constructor for class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- ColumnType(String, TypeCode, boolean, long) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
- combine(Iterable<? extends Instant>) - Method in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Combines the given times, which must be from the same window and must have been passed through TimestampCombiner.merge(org.apache.beam.sdk.transforms.windowing.BoundedWindow, java.lang.Iterable<? extends org.joda.time.Instant>).
- combine(Iterable<WindowedValue<KV<K, InputT>>>, Collector<WindowedValue<KV<K, AccumT>>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkPartialReduceFunction
- combine(AbstractFlinkCombineRunner.FlinkCombiner<K, InputT, AccumT, OutputT>, WindowingStrategy<Object, W>, SideInputReader, PipelineOptions, Iterable<WindowedValue<KV<K, InputT>>>, Collector<WindowedValue<KV<K, OutputT>>>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner
-
Consumes WindowedValues and produces combined output to the given output.
- combine(AbstractFlinkCombineRunner.FlinkCombiner<K, InputT, AccumT, OutputT>, WindowingStrategy<Object, W>, SideInputReader, PipelineOptions, Iterable<WindowedValue<KV<K, InputT>>>, Collector<WindowedValue<KV<K, OutputT>>>) - Method in class org.apache.beam.runners.flink.translation.functions.HashingFlinkCombineRunner
- combine(AbstractFlinkCombineRunner.FlinkCombiner<K, InputT, AccumT, OutputT>, WindowingStrategy<Object, W>, SideInputReader, PipelineOptions, Iterable<WindowedValue<KV<K, InputT>>>, Collector<WindowedValue<KV<K, OutputT>>>) - Method in class org.apache.beam.runners.flink.translation.functions.SingleWindowFlinkCombineRunner
- combine(AbstractFlinkCombineRunner.FlinkCombiner<K, InputT, AccumT, OutputT>, WindowingStrategy<Object, W>, SideInputReader, PipelineOptions, Iterable<WindowedValue<KV<K, InputT>>>, Collector<WindowedValue<KV<K, OutputT>>>) - Method in class org.apache.beam.runners.flink.translation.functions.SortingFlinkCombineRunner
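For orientation, a minimal sketch of the TimestampCombiner.combine behavior described above (the varargs variant appears just below); the timestamps are made-up values:

    import java.util.Arrays;
    import org.apache.beam.sdk.transforms.windowing.TimestampCombiner;
    import org.joda.time.Instant;

    public class TimestampCombinerExample {
      public static void main(String[] args) {
        Instant t1 = new Instant(1000L);
        Instant t2 = new Instant(2000L);

        // EARLIEST keeps the minimum timestamp, LATEST the maximum.
        // (In a real pipeline the inputs must come from the same window
        // and have been passed through merge(), per the entry above.)
        Instant earliest = TimestampCombiner.EARLIEST.combine(Arrays.asList(t1, t2)); // 1000
        Instant latest = TimestampCombiner.LATEST.combine(t1, t2); // varargs variant, 2000
        System.out.println(earliest + " " + latest);
      }
    }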
- combine(Instant...) - Method in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Varargs variant of TimestampCombiner.combine(java.lang.Iterable<? extends org.joda.time.Instant>).
- Combine - Class in org.apache.beam.sdk.transforms
-
PTransforms for combining PCollection elements globally and per-key.
- Combine.AccumulatingCombineFn<InputT, AccumT, OutputT> - Class in org.apache.beam.sdk.transforms
-
A CombineFn that uses a subclass of Combine.AccumulatingCombineFn.Accumulator as its accumulator type.
- Combine.AccumulatingCombineFn.Accumulator<InputT, AccumT, OutputT> - Interface in org.apache.beam.sdk.transforms
-
The type of mutable accumulator values used by this AccumulatingCombineFn.
- Combine.BinaryCombineDoubleFn - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on doubles.
- Combine.BinaryCombineFn<V> - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily expressed as binary operations.
- Combine.BinaryCombineIntegerFn - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on ints.
- Combine.BinaryCombineLongFn - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on longs.
- Combine.CombineFn<InputT, AccumT, OutputT> - Class in org.apache.beam.sdk.transforms
-
A CombineFn<InputT, AccumT, OutputT> specifies how to combine a collection of input values of type InputT into a single output value of type OutputT.
- Combine.Globally<InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
Combine.Globally<InputT, OutputT> takes a PCollection<InputT> and returns a PCollection<OutputT> whose elements are the result of combining all the elements in each window of the input PCollection, using a specified CombineFn<InputT, AccumT, OutputT>.
- Combine.GloballyAsSingletonView<InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
Combine.GloballyAsSingletonView<InputT, OutputT> takes a PCollection<InputT> and returns a PCollectionView<OutputT> whose elements are the result of combining all the elements in each window of the input PCollection, using a specified CombineFn<InputT, AccumT, OutputT>.
- Combine.GroupedValues<K, InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
GroupedValues<K, InputT, OutputT> takes a PCollection<KV<K, Iterable<InputT>>>, such as the result of GroupByKey, applies a specified CombineFn<InputT, AccumT, OutputT> to each of the input KV<K, Iterable<InputT>> elements to produce a combined output KV<K, OutputT> element, and returns a PCollection<KV<K, OutputT>> containing all the combined output elements.
- Combine.Holder<V> - Class in org.apache.beam.sdk.transforms
-
Holds a single value of type V which may or may not be present.
- Combine.IterableCombineFn<V> - Class in org.apache.beam.sdk.transforms
- Combine.PerKey<K, InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
PerKey<K, InputT, OutputT> takes a PCollection<KV<K, InputT>>, groups it by key, applies a combining function to the InputT values associated with each key to produce a combined OutputT value, and returns a PCollection<KV<K, OutputT>> representing a map from each distinct key of the input PCollection to the corresponding combined value.
- Combine.PerKeyWithHotKeyFanout<K, InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
Like Combine.PerKey, but sharding the combining of hot keys.
- Combine.SimpleCombineFn<V> - Class in org.apache.beam.sdk.transforms
-
Deprecated.
- CombineAsIterable<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
- CombineAsIterable() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CombineAsIterable
- CombineFieldsByFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
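The Combine.PerKey entry above is most often reached through a convenience wrapper; a minimal sketch (the keys and values are illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    public class CombinePerKeyExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

        // Sum.integersPerKey() expands to a Combine.PerKey under the hood.
        PCollection<KV<String, Integer>> totals =
            p.apply(Create.of(KV.of("a", 1), KV.of("a", 2), KV.of("b", 3)))
             .apply(Sum.integersPerKey()); // yields ("a", 3) and ("b", 3)

        p.run().waitUntilFinish();
      }
    }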
- combineFn - Variable in class org.apache.beam.runners.flink.translation.functions.FlinkPartialReduceFunction
- combineFn - Variable in class org.apache.beam.runners.flink.translation.functions.FlinkReduceFunction
- combineFn() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf
- combineFn() - Static method in class org.apache.beam.sdk.transforms.Count
-
Returns a Combine.CombineFn that counts the number of its inputs.
- combineFn() - Static method in class org.apache.beam.sdk.transforms.Latest
-
Returns a Combine.CombineFn that selects the latest element among its inputs.
- combineFn(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a Combine.CombineFn that computes a fixed-sized uniform sample of its inputs.
- CombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.CombineFn
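The combineFn() factories above plug directly into Combine.globally or Combine.perKey; a minimal sketch using Count.combineFn() (the input elements are illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;

    public class CombineFnFactoryExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

        // Count.combineFn() counts its inputs, here producing 3.
        PCollection<Long> count =
            p.apply(Create.of("x", "y", "z"))
             .apply(Combine.globally(Count.<String>combineFn()));

        p.run().waitUntilFinish();
      }
    }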
- CombineFnBase - Class in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- CombineFnBase() - Constructor for class org.apache.beam.sdk.transforms.CombineFnBase
- CombineFnBase.GlobalCombineFn<InputT, AccumT, OutputT> - Interface in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- CombineFns - Class in org.apache.beam.sdk.transforms
-
Static utility methods that create combine function instances.
- CombineFns() - Constructor for class org.apache.beam.sdk.transforms.CombineFns
- CombineFns.CoCombineResult - Class in org.apache.beam.sdk.transforms
-
A tuple of outputs produced by a composed combine function.
- CombineFns.ComposeCombineFnBuilder - Class in org.apache.beam.sdk.transforms
-
A builder class to construct a composed CombineFnBase.GlobalCombineFn.
- CombineFns.ComposedCombineFn<DataT> - Class in org.apache.beam.sdk.transforms
-
A composed Combine.CombineFn that applies multiple CombineFns.
- CombineFns.ComposedCombineFnWithContext<DataT> - Class in org.apache.beam.sdk.transforms
-
A composed CombineWithContext.CombineFnWithContext that applies multiple CombineFnWithContexts.
- CombineFnTester - Class in org.apache.beam.sdk.testing
-
Utilities for testing CombineFns.
- CombineFnTester() - Constructor for class org.apache.beam.sdk.testing.CombineFnTester
- CombineFnWithContext() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
- combineGlobally(JavaRDD<WindowedValue<InputT>>, SparkCombineFn<InputT, InputT, AccumT, OutputT>, Coder<AccumT>, WindowingStrategy<?, ?>) - Static method in class org.apache.beam.runners.spark.translation.GroupCombineFunctions
-
Apply a composite Combine.Globally transformation.
- CombineGroupedValues(SparkCombineFn<KV<K, InputT>, InputT, ?, OutputT>) - Constructor for class org.apache.beam.runners.spark.translation.TranslationUtils.CombineGroupedValues
- combinePerKey(JavaRDD<WindowedValue<KV<K, V>>>, SparkCombineFn<KV<K, V>, V, AccumT, ?>, Coder<K>, Coder<V>, Coder<AccumT>, WindowingStrategy<?, ?>) - Static method in class org.apache.beam.runners.spark.translation.GroupCombineFunctions
-
Apply a composite Combine.PerKey transformation.
- CombineWithContext - Class in org.apache.beam.sdk.transforms
-
This class contains combine functions that have access to PipelineOptions and side inputs through CombineWithContext.Context.
- CombineWithContext() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext
- CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT> - Class in org.apache.beam.sdk.transforms
-
A combine function that has access to PipelineOptions and side inputs through CombineWithContext.Context.
- CombineWithContext.Context - Class in org.apache.beam.sdk.transforms
-
Information accessible to all methods in CombineFnWithContext and KeyedCombineFnWithContext.
- CombineWithContext.RequiresContextInternal - Interface in org.apache.beam.sdk.transforms
-
An internal interface for signaling that a GloballyCombineFn or a PerKeyCombineFn needs to access CombineWithContext.Context.
- combining(Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Identical to StateSpecs.combining(CombineFn), but with an accumulator coder explicitly supplied.
- combining(Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- combining(Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Create a StateSpec for a CombiningState which uses a Combine.CombineFn to automatically merge multiple values of type InputT into a single resulting OutputT.
- combining(CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- combiningFromInputInternal(Coder<InputT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
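A minimal sketch of the StateSpecs.combining entry above paired with CombiningState (the DoFn name and state id are illustrative; this assumes Sum.ofIntegers()'s int[] accumulator type):

    import org.apache.beam.sdk.state.CombiningState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;

    // Keeps a running per-key sum; CombiningState merges each added value
    // into the accumulator via the supplied CombineFn.
    class RunningSumFn extends DoFn<KV<String, Integer>, KV<String, Integer>> {
      @StateId("sum")
      private final StateSpec<CombiningState<Integer, int[], Integer>> sumSpec =
          StateSpecs.combining(Sum.ofIntegers());

      @ProcessElement
      public void processElement(ProcessContext c,
          @StateId("sum") CombiningState<Integer, int[], Integer> sum) {
        sum.add(c.element().getValue());
        c.output(KV.of(c.element().getKey(), sum.read()));
      }
    }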
- CombiningState<InputT, AccumT, OutputT> - Interface in org.apache.beam.sdk.state
-
A ReadableState cell defined by a Combine.CombineFn, accepting multiple input values, combining them as specified into accumulators, and producing a single output value.
- comment(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
- commit() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- commitOffset(Offset) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.BlockingCommitterImpl
- commitOffsets() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Enable committing record offsets.
- commitOffsetsInFinalize() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Finalized offsets are committed to Kafka.
- commitWriteStreams(String, Iterable<String>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
-
Commit write streams of type PENDING.
- commitWriteStreams(String, Iterable<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- commitWriteStreams(String, Iterable<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- Common Kafka Consumer Configurations - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- commonPrefixLength(RandomAccessData, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
-
Compute the length of the common prefix of the two provided sets of bytes.
- compact(AccumT) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- compact(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns an accumulator that represents the same logical value as the input accumulator, but may have a more compact representation.
- compact(AccumT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns an accumulator that represents the same logical value as the input accumulator, but may have a more compact representation.
- compact(Object[]) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- compact(Object[], CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- compact(List<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- compare(byte[], byte[]) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- compare(JobMessage, JobMessage) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil.TimeStampComparator
- compare(RandomAccessData, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
- compare(RandomAccessData, RandomAccessData, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
-
Compare the two sets of bytes starting at the given offset.
- compare(KV<K, V>, KV<K, V>) - Method in class org.apache.beam.sdk.values.KV.OrderByKey
- compare(KV<K, V>, KV<K, V>) - Method in class org.apache.beam.sdk.values.KV.OrderByValue
- compare(Row, Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel.BeamSqlRowComparator
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Largest
-
Deprecated.
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Natural
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Reversed
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Smallest
-
Deprecated.
- compareSchemaField(Schema.Field, Schema.Field) - Static method in class org.apache.beam.sdk.io.jdbc.SchemaUtil
-
Compares two fields.
- compareSerialized(DataInputView, DataInputView) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- compareTo(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- compareTo(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- compareTo(ByteArray) - Method in class org.apache.beam.runners.spark.util.ByteArray
- compareTo(ContiguousSequenceRange) - Method in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
- compareTo(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKey
-
ByteKey implements Comparable<ByteKey> by comparing the arrays in lexicographic order.
- compareTo(RedisCursor) - Method in class org.apache.beam.sdk.io.redis.RedisCursor
-
RedisCursor implements Comparable<RedisCursor> by transforming the cursors to an index of the Redis table.
- compareTo(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
- compareToReference(TypeComparator<byte[]>) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- comparing(SerializableFunction<? super T, ? extends V>) - Static method in interface org.apache.beam.sdk.transforms.SerializableComparator
-
Analogous to Comparator.comparing(Function), except that it takes in a SerializableFunction as the key extractor and returns a SerializableComparator.
- comparingNullFirst(Function<? super T, ? extends K>) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
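A minimal sketch of SerializableComparator.comparing as described above, used with Top.of (the keys and values are illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.SerializableComparator;
    import org.apache.beam.sdk.transforms.Top;
    import org.apache.beam.sdk.values.KV;

    public class ComparingExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

        // Build a serializable comparator from a key extractor.
        SerializableComparator<KV<String, Integer>> byValue =
            SerializableComparator.comparing(KV::getValue);

        p.apply(Create.of(KV.of("a", 1), KV.of("b", 5), KV.of("c", 3)))
         .apply(Top.of(2, byValue)); // keeps the two KVs with the largest values

        p.run().waitUntilFinish();
      }
    }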
- CompatibilityError() - Constructor for class org.apache.beam.sdk.schemas.transforms.Cast.CompatibilityError
- compile(List<CEPPattern>, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.nfa.NFA
- CompileException(DiagnosticCollector<?>) - Constructor for exception class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler.CompileException
- complete() - Method in class org.apache.beam.runners.jet.processors.ParDoP
- complete() - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
- complete() - Method in class org.apache.beam.runners.jet.processors.ImpulseP
- complete() - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- complete() - Method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
- complete() - Method in class org.apache.beam.runners.jet.processors.ViewP
- complete() - Method in class org.apache.beam.runners.jet.processors.WindowGroupP
- complete(List<TimestampedValue<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
-
Constructs a Watch.Growth.PollResult with the given outputs and declares that there will be no new outputs for the current input.
- complete(Instant, List<OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
-
Like Watch.Growth.PollResult.complete(List), but assigns the same timestamp to all new outputs.
- completed() - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
-
Report that the pipeline has successfully completed.
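A minimal sketch of the Watch.Growth.PollResult.complete factories above inside a poll function (the class name and output naming are illustrative):

    import java.util.Arrays;
    import org.apache.beam.sdk.transforms.Contextful;
    import org.apache.beam.sdk.transforms.Watch;
    import org.apache.beam.sdk.values.TimestampedValue;
    import org.joda.time.Instant;

    class ListOnceFn extends Watch.Growth.PollFn<String, String> {
      @Override
      public Watch.Growth.PollResult<String> apply(String input, Contextful.Fn.Context c) {
        // complete(...) declares that no further outputs will ever appear
        // for this input, so Watch can stop polling it.
        return Watch.Growth.PollResult.complete(
            Arrays.asList(TimestampedValue.of(input + "/result", Instant.now())));
      }
    }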
- completeEdge(int) - Method in class org.apache.beam.runners.jet.processors.ParDoP
- completeEdge(int) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- CompleteFlinkCombiner(CombineFnBase.GlobalCombineFn<InputT, AccumT, OutputT>) - Constructor for class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.CompleteFlinkCombiner
- complexityFactor - Variable in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator
- COMPONENT_ENCODINGS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- compose() - Static method in class org.apache.beam.sdk.transforms.CombineFns
-
Returns a CombineFns.ComposeCombineFnBuilder to construct a composed CombineFnBase.GlobalCombineFn.
- compose(String, SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.PTransform
-
Like PTransform.compose(SerializableFunction), but with a custom name.
- compose(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.PTransform
-
For a SerializableFunction<InputT, OutputT> fn, returns a PTransform given by applying fn.apply(v) to the input PCollection<InputT>.
- ComposeCombineFnBuilder() - Constructor for class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
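A minimal sketch of the PTransform.compose overloads above (the transform name and elements are illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.PTransform;
    import org.apache.beam.sdk.values.PCollection;

    public class ComposeExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

        // Wraps a whole-PCollection lambda into a named composite transform.
        PTransform<PCollection<String>, PCollection<Long>> countAll =
            PTransform.compose("CountAll", input -> input.apply(Count.<String>globally()));

        p.apply(Create.of("a", "b", "c")).apply(countAll);
        p.run().waitUntilFinish();
      }
    }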
- COMPOSITE_TYPES - Static variable in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- CompressedReader(CompressedSource<T>, FileBasedSource.FileBasedReader<T>) - Constructor for class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
Create a CompressedReader from a CompressedSource and delegate reader.
- CompressedSource<T> - Class in org.apache.beam.sdk.io
-
A Source that reads from compressed files.
- CompressedSource.CompressedReader<T> - Class in org.apache.beam.sdk.io
-
Reader for a CompressedSource.
- CompressedSource.CompressionMode - Enum Class in org.apache.beam.sdk.io
-
Deprecated. Use Compression instead.
- CompressedSource.DecompressingChannelFactory - Interface in org.apache.beam.sdk.io
-
Factory interface for creating channels that decompress the content of an underlying channel.
- Compression - Enum Class in org.apache.beam.sdk.io
-
Various compression types for reading/writing files.
- compute(Iterator<RawUnionValue>, RecordCollector<WindowedValue<OutputT>>) - Method in class org.apache.beam.runners.twister2.translators.functions.OutputTagFilter
- compute(Iterator<WindowedValue<InputT>>, RecordCollector<RawUnionValue>) - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
- compute(Iterator<WindowedValue<T>>, RecordCollector<WindowedValue<T>>) - Method in class org.apache.beam.runners.twister2.translators.functions.AssignWindowsFunction
- compute(Partition, TaskContext) - Method in class org.apache.beam.runners.spark.io.SourceRDD.Bounded
- compute(Partition, TaskContext) - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
- compute(Time) - Method in class org.apache.beam.runners.spark.translation.SingleEmitInputDStream
- compute(Time) - Method in class org.apache.beam.runners.spark.translation.streaming.TestDStream
- computeIfAbsent(K, Function<? super K, ? extends V>) - Method in interface org.apache.beam.sdk.state.MapState
-
A deferred read-followed-by-write.
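A minimal sketch of MapState.computeIfAbsent as described above, inside a stateful DoFn (the DoFn name, state id, and mapping function are illustrative):

    import java.util.Objects;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.coders.VarIntCoder;
    import org.apache.beam.sdk.state.MapState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    class DefaultingFn extends DoFn<KV<String, String>, Integer> {
      @StateId("cache")
      private final StateSpec<MapState<String, Integer>> cacheSpec =
          StateSpecs.map(StringUtf8Coder.of(), VarIntCoder.of());

      @ProcessElement
      public void processElement(ProcessContext c,
          @StateId("cache") MapState<String, Integer> cache) {
        // The mapping function runs only if the key is absent; the deferred
        // write happens when read() returns, per the entry above.
        Integer len = cache.computeIfAbsent(c.element().getValue(), String::length).read();
        c.output(Objects.requireNonNull(len));
      }
    }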
- computeOutputs() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Computes the outputs for all RDDs that are leaves in the DAG and do not have any actions (like saving to a file) registered on them.
- computeOutputs() - Method in class org.apache.beam.runners.spark.translation.SparkTranslationContext
-
Compute the outputs for all RDDs that are leaves in the DAG.
- computeSelfCost(RelOptPlanner, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
- computeSelfCost(RelOptPlanner, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- concat(Iterable<T>...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables
-
Concatenates the Iterables.
- concat(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- concat(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- concat(String, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- concat(String, String, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- concat(String, String, String, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- concat(Iterator<T>...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
-
Concatenates the Iterators.
- concat(List<T>, List<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
- CONCAT - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- CONCAT_FIELD_NAMES - Static variable in class org.apache.beam.sdk.schemas.utils.SelectHelpers
-
This policy keeps all levels of a name.
- CONCAT_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
- Concatenate() - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- ConcatenateAsIterable() - Constructor for class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- concatFieldNames() - Method in class org.apache.beam.sdk.schemas.transforms.Select.Flattened
-
For nested fields, concatenate all the names separated by a _ character in the flattened schema.
- concatIterators(Iterator<Iterator<T>>) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
- CONCRETE_CLASS - Static variable in class org.apache.beam.sdk.io.WriteFiles
-
For internal use by runners.
- config() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- config() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
- Config() - Constructor for class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config
- configuration - Variable in class org.apache.beam.runners.jobsubmission.JobServerDriver
- Configuration - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- Section
- Configuration() - Constructor for class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergCdcReadSchemaTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergReadSchemaTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.ReadBuilder.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
- configurationClass() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
-
Provides the required TypedSchemaTransformProvider.configurationClass().
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
-
Returns the expected class of the configuration.
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadProvider
-
Returns the expected class of the configuration.
- configurationClass() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
-
Returns the expected class of the configuration.
- configurationClass() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
- ConfigurationLocator() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions.ConfigurationLocator
- Configuration of AWS clients - Search tag in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO
- Section
- Configuration of AWS clients - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- Configuration of AWS clients - Search tag in class org.apache.beam.sdk.io.aws2.sns.SnsIO
- Section
- Configuration of AWS clients - Search tag in class org.apache.beam.sdk.io.aws2.sqs.SqsIO
- Section
- Configuration Options - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- configurationSchema() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
- configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
- Configurations of ReadSourceDescriptors - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- configure() - Static method in class org.apache.beam.sdk.transforms.windowing.Window
-
Returns a new builder for a Window transform for setting windowing parameters other than the windowing function.
- configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
- configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
- configure(Configuration) - Method in class org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat
- configure(Configuration) - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat
- ConfigWrapper<T> - Class in org.apache.beam.sdk.io.cdap
-
Class for building a PluginConfig object of a specific class.
- ConfigWrapper(Class<T>) - Constructor for class org.apache.beam.sdk.io.cdap.ConfigWrapper
- ConfluentSchemaRegistryDeserializerProvider<T> - Class in org.apache.beam.sdk.io.kafka
-
A DeserializerProvider that uses Confluent Schema Registry to resolve a Deserializer and Coder given a subject.
- connect() - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
Connect to the Redis instance.
- connect() - Method in class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- connect() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
Establishes a connection to the service.
- connect(String, Properties) - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
-
Configures Beam-specific options and opens a JDBC connection to Calcite.
- connect(CatalogManager, PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
-
Like JdbcDriver.connect(TableProvider, PipelineOptions), but overrides the top-level schema with a CatalogManager.
- connect(TableProvider, PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
-
Connects to the driver using the standard JdbcDriver.connect(String, Properties) call, but overrides the initial schema factory.
- CONNECT_STRING_PREFIX - Static variable in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
- connection() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- connectionAcquisitionTimeout() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Milliseconds to wait when acquiring a connection from the pool before giving up and timing out.
- connectionAcquisitionTimeout(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Milliseconds to wait when acquiring a connection from the pool before giving up and timing out.
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ConnectionConfiguration
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
- ConnectionManager - Class in org.apache.beam.sdk.io.cassandra
- ConnectionManager() - Constructor for class org.apache.beam.sdk.io.cassandra.ConnectionManager
- connectionMaxIdleTime() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Maximum milliseconds a connection should be allowed to remain open while idle.
- connectionMaxIdleTime(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Maximum milliseconds a connection should be allowed to remain open while idle.
- connectionTimeout() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Milliseconds to wait when initially establishing a connection before giving up and timing out.
- connectionTimeout(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Milliseconds to wait when initially establishing a connection before giving up and timing out.
- connectionTimeToLive() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Maximum milliseconds a connection should be allowed to remain open, regardless of usage frequency.
- connectionTimeToLive(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Maximum milliseconds a connection should be allowed to remain open, regardless of usage frequency.
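Taken together, the four HttpClientConfiguration settings above are typically set through the class's builder; a minimal sketch, assuming the usual AutoValue-style builder()/build() factory methods (the timeout values are illustrative):

    import org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration;

    public class HttpClientConfigExample {
      public static void main(String[] args) {
        // All values are milliseconds, per the entries above.
        HttpClientConfiguration config =
            HttpClientConfiguration.builder()
                .connectionAcquisitionTimeout(10_000)
                .connectionTimeout(5_000)
                .connectionMaxIdleTime(60_000)
                .connectionTimeToLive(300_000)
                .build();
        System.out.println(config);
      }
    }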
- ConnectorConfiguration() - Constructor for class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
- Connector retries - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- Connectors - Enum Class in org.apache.beam.io.debezium
-
Enumeration of Debezium connectors.
- consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.FooterCoder
- consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShardCoder
- consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefixCoder
- consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- consistentWithEquals() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- consistentWithEquals() - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BitSetCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BooleanCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.ByteCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.Coder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DequeCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DoubleCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DurationCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.FloatCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.InstantCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.KvCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
LengthPrefixCoder is consistent with equals if the nested Coder is.
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.ListCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.MapCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.NullableCoder
-
NullableCoder is consistent with equals if the nested Coder is.
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.OptionalCoder
-
OptionalCoder is consistent with equals if the nested Coder is.
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.SerializableCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VarIntCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VarLongCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.ZstdCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.MetadataCoderV2
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
- consistentWithEquals() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
- consistentWithEquals() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
- ConsoleIO - Class in org.apache.beam.runners.spark.io
-
Print to console.
- ConsoleIO.Write - Class in org.apache.beam.runners.spark.io
-
Write to console.
- ConsoleIO.Write.Unbound<T> - Class in org.apache.beam.runners.spark.io
-
PTransform writing a PCollection to the console.
- constant(FileBasedSink.FilenamePolicy) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
-
A specialization of DynamicFileDestinations.constant(FilenamePolicy, SerializableFunction) for the case where UserT and OutputT are the same type and the format function is the identity.
- constant(FileBasedSink.FilenamePolicy, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
-
Returns a FileBasedSink.DynamicDestinations that always returns the same FileBasedSink.FilenamePolicy.
- constant(OutT) - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
- CONSTANT_WINDOW_SIZE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- constantDestinations(FileBasedSink.FilenamePolicy, Schema, Map<String, Object>, CodecFactory, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Returns a DynamicAvroDestinations that always returns the same FileBasedSink.FilenamePolicy, schema, metadata, and codec.
- constantDestinations(FileBasedSink.FilenamePolicy, Schema, Map<String, Object>, CodecFactory, SerializableFunction<UserT, OutputT>, AvroSink.DatumWriterFactory<OutputT>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Returns a DynamicAvroDestinations that always returns the same FileBasedSink.FilenamePolicy, schema, metadata, and codec.
- Constraints - Search tag in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
- Section
- Constraints - Search tag in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- Section
- Constraints - Search tag in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
- Section
- constructFilter(List<RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
- constructFilter(List<RexNode>) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Generate an IO implementation of BeamSqlTableFilter for predicate push-down.
- constructFilter(List<RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
- constructFilter(List<RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
- constructName(ResourceId, String, String, int, int, String, String) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
Constructs a fully qualified name from components.
- Consumed positions - Search tag in interface org.apache.beam.sdk.io.range.RangeTracker
- Section
- consumesProjection() - Method in interface org.apache.beam.sdk.schemas.ProjectionConsumer
-
Returns a map from input TupleTag id to a FieldAccessDescriptor describing which Schema fields this must access from the corresponding input PCollection to complete successfully.
- Consuming messages from RabbitMQ server - Search tag in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO
- Section
- contains(Descriptors.Descriptor) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- contains(List<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.contains(List).
- contains(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.contains(Object[]).
- contains(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.contains(Matcher[]).
- contains(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns whether this window contains the given window.
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkSideInputReader
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.CachedSideInputReader
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.util.CachedSideInputReader
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.twister2.utils.Twister2SideInputReader
- contains(T) - Method in interface org.apache.beam.sdk.state.SetState
-
Returns a ReadableState whose ReadableState.read() method will return true if this set contains the specified element at the point when that ReadableState.read() call returns.
- contains(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.contains(Object[]).
- containsInAnyOrder() - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Deprecated. Prefer PAssert.IterableAssert.empty() to this method.
- containsInAnyOrder() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
- containsInAnyOrder(Iterable<T>) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Asserts that the iterable in question contains the provided elements.
- containsInAnyOrder(Iterable<T>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
Checks that the Iterable contains the expected elements, in any order.
- containsInAnyOrder(Collection<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.containsInAnyOrder(Collection).
- containsInAnyOrder(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.containsInAnyOrder(Object[]).
- containsInAnyOrder(SerializableMatcher<? super T>...) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Asserts that the iterable in question matches the provided elements.
- containsInAnyOrder(SerializableMatcher<? super T>...) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
Checks that the Iterable contains elements that match the provided matchers, in any order.
- containsInAnyOrder(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.containsInAnyOrder(Matcher[]).
- containsInAnyOrder(T...) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Asserts that the iterable in question contains the provided elements.
- containsInAnyOrder(T...) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
Checks that the Iterable contains the expected elements, in any order.
- containsInAnyOrder(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.containsInAnyOrder(Object[]).
- containsKey(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- containsKey(K) - Method in interface org.apache.beam.sdk.state.MultimapState
-
Returns a ReadableState whose ReadableState.read() method will return true if this multimap contains the specified key at the point when that ReadableState.read() call returns.
- containsKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns true if the specified ByteKey is contained within this range.
- containsSeekableInput(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
-
This method returns whether any of the children of the relNode are Seekable.
- containsString(String) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.containsString(java.lang.String).
- containsValue(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- CONTENT_STRUCTURE_UNSPECIFIED - Enum constant in enum class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
-
If the content structure is not specified, the default value BUNDLE will be used.
- context - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- context - Variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter
-
Conversion context; contains things like FrameworkConfig, QueryTrait, and other state used during conversion.
- Context() - Constructor for class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
- Context() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext.Context
- Context() - Constructor for class org.apache.beam.sdk.transforms.Contextful.Fn.Context
- Context(boolean) - Constructor for class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated.
- Context(TableDataInsertAllResponse.InsertErrors) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy.Context
- Contextful<ClosureT> - Class in org.apache.beam.sdk.transforms
-
Pair of a bit of user code (a "closure") and the Requirements needed to run it.
- Contextful.Fn<InputT, OutputT> - Interface in org.apache.beam.sdk.transforms
-
A function from an input to an output that may additionally access Contextful.Fn.Context when computing the result.
- Contextful.Fn.Context - Class in org.apache.beam.sdk.transforms
-
An accessor for additional capabilities available in Contextful.Fn.apply(InputT, org.apache.beam.sdk.transforms.Contextful.Fn.Context).
- contextSideInput - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytes
- contextSideInput - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytesWithContext
- contextSideInput - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUri
- contextSideInput - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUriWithContext
- ContextualTextIO - Class in org.apache.beam.sdk.io.contextualtextio
-
PTransforms that read text files and collect contextual information of the elements in the input.
- ContextualTextIO.Read - Class in org.apache.beam.sdk.io.contextualtextio
-
Implementation of ContextualTextIO.read().
- ContextualTextIO.ReadFiles - Class in org.apache.beam.sdk.io.contextualtextio
-
Implementation of ContextualTextIO.readFiles().
- ContiguousSequenceRange - Class in org.apache.beam.sdk.extensions.ordered
-
A range of contiguous event sequences and the latest timestamp of the events in the range.
- ContiguousSequenceRange() - Constructor for class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
- CONTINUE - Enum constant in enum class org.apache.beam.runners.local.ExecutionDriver.DriverState
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.Match
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
Continuously watches for new files at the given interval until the given termination condition is reached, where the input to the condition is the filepattern.
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.FileIO.Match
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
Continuously watches for new files at the given interval until the given termination condition is reached, where the input to the condition is the filepattern.
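A minimal sketch of the continuously(...) entries above on FileIO.match (the filepattern and durations are illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Watch;
    import org.joda.time.Duration;

    public class ContinuousMatchExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

        // Poll the filepattern every 30 seconds; stop once no new files
        // have appeared for an hour.
        p.apply(FileIO.match()
            .filepattern("/tmp/incoming/*.csv")
            .continuously(
                Duration.standardSeconds(30),
                Watch.Growth.afterTimeSinceNewOutput(Duration.standardHours(1))));

        p.run().waitUntilFinish();
      }
    }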
- control(StreamObserver<BeamFnApi.InstructionRequest>) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
-
Called by gRPC for each incoming connection from an SDK harness; enqueues an available SDK harness client.
- ControlClientPool - Interface in org.apache.beam.runners.fnexecution.control
-
A pool of control clients that brokers incoming SDK harness connections (in the form of InstructionRequestHandlers).
- ControlClientPool.Sink - Interface in org.apache.beam.runners.fnexecution.control
-
A sink for InstructionRequestHandlers keyed by worker id.
- ControlClientPool.Source - Interface in org.apache.beam.runners.fnexecution.control
-
A source of InstructionRequestHandlers.
- ConversionContext - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
-
Conversion context; some rules need this data to convert the nodes.
- ConversionOptions() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
- convert() - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable.RowToDocument
- convert(ResolvedNodes.ResolvedQueryStmt, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter
- convert(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamEnumerableConverterRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIntersectRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOSinkRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMatchRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMinusRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputLookupJoinRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSortRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamTableFunctionScanRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUncollectRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnionRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamValuesRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamWindowRule
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRule
- Convert - Class in org.apache.beam.sdk.schemas.transforms
-
A set of utilities for converting between different objects supporting schemas.
- Convert() - Constructor for class org.apache.beam.sdk.schemas.transforms.Convert
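As a brief sketch of the Convert utilities (MyPojo is a hypothetical class assumed to have a registered schema, e.g. via @DefaultSchema(JavaFieldSchema.class)):

    // Round-trip between a schema'd user type and generic Rows.
    PCollection<Row> rows = pojos.apply(Convert.toRows());
    PCollection<MyPojo> back = rows.apply(Convert.fromRows(MyPojo.class));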
- CONVERT_TO_BIG_DECIMAL - Enum constant in enum class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Behavior
-
Converts the unsigned value to a
BigDecimal
value. - CONVERT_TO_STRING - Enum constant in enum class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Behavior
-
Converts the unsigned value to a string representation.
- convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertAvroFieldStrict(Object, Schema, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Strict conversion from Avro to Beam; strict in that it performs no widening or narrowing during conversion.
- convertAvroFieldStrict(Object, Schema, Schema.FieldType, GenericData) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Strict conversion from Avro to Beam; strict in that it performs no widening or narrowing during conversion.
- convertAvroFormat(Schema.FieldType, Object, BigQueryUtils.ConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Tries to convert an Avro-decoded value to a Beam field value based on the target type of the Beam field.
- convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertType
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertValueForGetter
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertValueForSetter
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- ConvertedSchemaInformation(SchemaCoder<T>, Schema.FieldType) - Constructor for class org.apache.beam.sdk.schemas.utils.ConvertHelpers.ConvertedSchemaInformation
- convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertGenericRecordToTableRow(GenericRecord) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Converts a generic record to a BigQuery TableRow.
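A minimal sketch, assuming record is an Avro GenericRecord whose schema maps cleanly onto BigQuery types:

    // record is an assumed Avro GenericRecord; the result is a BigQuery TableRow.
    TableRow row = BigQueryUtils.convertGenericRecordToTableRow(record);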
- convertGenericRecordToTableRow(GenericRecord, TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Deprecated.
- ConvertHelpers - Class in org.apache.beam.sdk.schemas.utils
-
Helper functions for converting between equivalent schema types.
- ConvertHelpers() - Constructor for class org.apache.beam.sdk.schemas.utils.ConvertHelpers
- ConvertHelpers.ConvertedSchemaInformation<T> - Class in org.apache.beam.sdk.schemas.utils
-
Return value after converting a schema.
- convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertNewPartitionRowKeyToPartition(ByteString) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Converts a new-partition row key back to a partition, in order to process metadata read from Bigtable.
- convertNode2Map(JsonNode) - Static method in class org.apache.beam.sdk.extensions.sql.TableUtils
- convertNumbers(TableRow) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- convertPartitionToNewPartitionRowKey(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Converts a partition to a New Partition row key, used to query for partitions ready to be streamed as the result of splits and merges.
- convertPartitionToStreamPartitionRowKey(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Converts a partition to a Stream Partition row key, used to query for metadata of partitions that are currently being streamed.
- convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- convertRelNodeToRexRangeRef(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
- convertRelOptCost(RelOptCost) - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- convertResolvedLiteral(ResolvedNodes.ResolvedLiteral) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
-
Convert a resolved literal to a RexNode.
- convertRexNodeFromResolvedExpr(ResolvedNodes.ResolvedExpr) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
-
Create a RexNode for a corresponding resolved expression.
- convertRexNodeFromResolvedExpr(ResolvedNodes.ResolvedExpr, List<ResolvedColumn>, List<RelDataTypeField>, Map<String, RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
-
Create a RexNode for a corresponding resolved expression node.
- convertRootQuery(ConversionContext, ResolvedNodes.ResolvedQueryStmt) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter
- convertStreamPartitionRowKeyToPartition(ByteString) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Converts a stream-partition row key back to a partition, in order to process metadata read from Bigtable.
- convertTableValuedFunction(RelNode, TableValuedFunction, List<ResolvedNodes.ResolvedFunctionArgument>, List<ResolvedColumn>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
-
Convert a TableValuedFunction in ZetaSQL to a RexCall in Calcite.
- convertToBagSpecInternal(StateSpec<CombiningState<InputT, AccumT, OutputT>>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- convertToBeamRel(String) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
- convertToBeamRel(String, List<Value>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
- convertToBeamRel(String, Map<String, Value>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
- convertToBeamRel(String, QueryPlanner.QueryParameters) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
-
Parses and validates the input query, then converts it into a
BeamRelNode
tree. - convertToBeamRel(String, QueryPlanner.QueryParameters) - Method in interface org.apache.beam.sdk.extensions.sql.impl.QueryPlanner
-
Parses and validates the input query, then converts it into a
BeamRelNode
tree. - convertToBeamRel(String, QueryPlanner.QueryParameters) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
- convertToFileResourceIfPossible(String) - Static method in class org.apache.beam.sdk.io.FileBasedSink
-
A helper function that converts a user-provided output filename prefix into a
ResourceId
for writing output files. - convertToJcsmpDestination(Solace.Destination) - Static method in class org.apache.beam.sdk.io.solace.SolaceIO
-
Convert to a JCSMP destination from a schema-enabled
Solace.Destination
. - convertToMapSpecInternal(StateSpec<SetState<KeyT>>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- convertToMultimapSpecInternal(StateSpec<MapState<KeyT, ValueT>>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- ConvertType(boolean) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
- ConvertValueForGetter(StackManipulation) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- ConvertValueForSetter(StackManipulation) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- copy() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns a copy of this RandomAccessData.
- copy() - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
- copy() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
- copy(byte[]) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- copy(byte[], byte[]) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- copy(Iterable<String>, Iterable<String>) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
- copy(List<ClassLoaderFileSystem.ClassLoaderResourceId>, List<ClassLoaderFileSystem.ClassLoaderResourceId>) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
- copy(List<ResourceId>, List<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Copies a
List
of file-like resources from one location to another. - copy(List<ResourceIdT>, List<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
-
Copies a
List
of file-like resources from one location to another. - copy(StateNamespace) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
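A sketch of the FileSystems.copy entry above (paths are illustrative, and the matching filesystem, e.g. GCS, is assumed to be registered):

    // FileSystems.copy throws IOException; error handling is elided here.
    ResourceId src = FileSystems.matchNewResource("gs://bucket/in/part-0.txt", false /* isDirectory */);
    ResourceId dst = FileSystems.matchNewResource("gs://bucket/out/part-0.txt", false);
    FileSystems.copy(
        Collections.singletonList(src),
        Collections.singletonList(dst),
        MoveOptions.StandardMoveOptions.IGNORE_MISSING_FILES);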
- copy(StateNamespace, StateNamespace) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
- copy(RelTraitSet, List<RelNode>, RexNode, Type, RelDataType, Set<RelColumnMapping>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
- copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
- copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
- copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRel
- copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
- copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
- copy(RelTraitSet, RelNode, List<RexLiteral>, RelDataType, List<Window.Group>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
- copy(RelTraitSet, RelNode, RelCollation, RexNode, RexNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- copy(RelTraitSet, RelNode, RexProgram) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
- copy(RelTraitSet, RelNode, RexProgram) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRel
- copy(RelTraitSet, RelNode, ImmutableBitSet, List<ImmutableBitSet>, List<AggregateCall>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
- copy(RelTraitSet, RexNode, RelNode, RelNode, JoinRelType, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCoGBKJoinRel
- copy(RelTraitSet, RexNode, RelNode, RelNode, JoinRelType, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputJoinRel
- copy(RelTraitSet, RexNode, RelNode, RelNode, JoinRelType, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputLookupJoinRel
- copy(RelNode, RelDataType, RexNode, boolean, boolean, Map<String, RexNode>, Map<String, RexNode>, RexNode, Map<String, ? extends SortedSet<String>>, boolean, ImmutableBitSet, RelCollation, RexNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
- copy(DataInputView, DataOutputView) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- copy(DataInputView, DataOutputView) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- copy(DataInputView, DataOutputView) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- copy(T) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- copy(T, T) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- copyFrom(byte[]) - Static method in class org.apache.beam.sdk.io.range.ByteKey
-
Creates a new
ByteKey
backed by a copy of the specifiedbyte[]
. - copyFrom(ByteBuffer) - Static method in class org.apache.beam.sdk.io.range.ByteKey
-
Creates a new
ByteKey
backed by a copy of the data remaining in the specifiedByteBuffer
. - copyFrom(FieldSpecifierNotationParser.DotExpressionComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
- copyFrom(FieldSpecifierNotationParser.QualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifierListContext
- copyResourcesFromJar(JarFile) - Method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarCreator
-
Copy resources from
inputJar
to PortablePipelineJarCreator.outputStream
. - copyToList(ArrayData, DataType) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
- coreName() - Method in class org.apache.beam.sdk.io.solr.SolrIO.ReplicaInfo
- coreUrl() - Method in class org.apache.beam.sdk.io.solr.SolrIO.ReplicaInfo
- CorrelationKey() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey
- cosh(Double) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinTrigonometricFunctions
-
COSH(X)
- CosmosClientBuilderFactory() - Constructor for class org.apache.beam.sdk.io.azure.cosmos.CosmosOptions.CosmosClientBuilderFactory
- CosmosIO - Class in org.apache.beam.sdk.io.azure.cosmos
- CosmosIO.BoundedCosmosBDSource<T> - Class in org.apache.beam.sdk.io.azure.cosmos
-
A
BoundedSource
reading from Cosmos DB. - CosmosIO.Read<T> - Class in org.apache.beam.sdk.io.azure.cosmos
- CosmosOptions - Interface in org.apache.beam.sdk.io.azure.cosmos
- CosmosOptions.CosmosClientBuilderFactory - Class in org.apache.beam.sdk.io.azure.cosmos
-
Creates a Cosmos client from the pipeline options.
- COST_OPTIMIZED - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.FlexResourceSchedulingGoal
-
Optimize for lower cost.
- Count - Class in org.apache.beam.sdk.transforms
-
PTransforms
to count the elements in aPCollection
. - COUNT - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
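A short sketch of the Count transforms indexed above (the input collection is assumed):

    PCollection<String> words = ...; // assumed input
    PCollection<KV<String, Long>> perElement = words.apply(Count.perElement());
    PCollection<Long> total = words.apply(Count.globally());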
- countAsserts(Pipeline) - Static method in class org.apache.beam.sdk.testing.PAssert
- counter(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can be incremented and decremented, and is aggregated by taking the sum.
- counter(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can be incremented and decremented, and is aggregated by taking the sum.
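A minimal sketch of Metrics.counter inside a DoFn (class and metric names are illustrative):

    class ParseFn extends DoFn<String, String> {
      // Reported under the ParseFn namespace and aggregated by summing across workers.
      private final Counter malformed = Metrics.counter(ParseFn.class, "malformed-records");

      @ProcessElement
      public void processElement(@Element String line, OutputReceiver<String> out) {
        if (line.isEmpty()) {
          malformed.inc();
          return;
        }
        out.output(line);
      }
    }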
- Counter - Interface in org.apache.beam.sdk.metrics
-
A metric that reports a single long value and can be incremented or decremented.
- COUNTER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTable
- CounterImpl - Class in org.apache.beam.runners.jet.metrics
-
Implementation of
Counter
. - CounterMark(long, Instant) - Constructor for class org.apache.beam.sdk.io.CountingSource.CounterMark
-
Creates a checkpoint mark reflecting the last emitted value.
- CounterMarkCoder() - Constructor for class org.apache.beam.sdk.io.CountingSource.CounterMarkCoder
- CountErrors(Counter) - Constructor for class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler.WriteErrorMetrics.CountErrors
- CountIf - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
-
Returns the count of TRUE values for an expression.
- COUNTIF - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- CountIf.CountIfFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
- CountIfFn() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- CountingPipelineVisitor - Class in org.apache.beam.runners.flink.translation.utils
-
Pipeline visitors that fills a lookup table of
PValue
to number of consumers. - CountingPipelineVisitor() - Constructor for class org.apache.beam.runners.flink.translation.utils.CountingPipelineVisitor
- CountingReadableByteChannel - Class in org.apache.beam.sdk.extensions.gcp.util.channels
- CountingReadableByteChannel(ReadableByteChannel, Consumer<Integer>) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.channels.CountingReadableByteChannel
- CountingSeekableByteChannel - Class in org.apache.beam.sdk.extensions.gcp.util.channels
- CountingSeekableByteChannel(SeekableByteChannel, Consumer<Integer>, Consumer<Integer>) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- CountingSource - Class in org.apache.beam.sdk.io
-
Most users should use
GenerateSequence
instead. - CountingSource.CounterMark - Class in org.apache.beam.sdk.io
-
The checkpoint for an unbounded
CountingSource
is simply the last value produced. - CountingSource.CounterMarkCoder - Class in org.apache.beam.sdk.io
-
A custom coder for
CounterMark
. - CountingWritableByteChannel - Class in org.apache.beam.sdk.extensions.gcp.util.channels
- CountingWritableByteChannel(WritableByteChannel, Consumer<Integer>) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.channels.CountingWritableByteChannel
- countPartitionsCreatedAfter(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Counts all partitions with a
PartitionMetadataAdminDao.COLUMN_CREATED_AT
less than the given timestamp. - CountWords() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.CountWords
- CovarianceFn<T> - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
-
Combine.CombineFn
for Covariance onNumber
types. - coverSameKeySpace(List<Range.ByteStringRange>, Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Returns true if parentPartitions form a proper superset of childPartition.
- CrashingRunner - Class in org.apache.beam.sdk.testing
-
A
PipelineRunner
that applies no overrides and throws an exception on calls toPipeline.run()
. - CrashingRunner() - Constructor for class org.apache.beam.sdk.testing.CrashingRunner
- create() - Static method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Creates a ConnectorConfiguration.
- create() - Static method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
-
Returns a
DataflowGroupByKey<K, V>
PTransform
. - create() - Static method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.PayloadTranslator
- create() - Static method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
-
Creates a
MapControlClientPool
. - create() - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessManager
- create() - Static method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
-
Create a new
GrpcStateService
. - create() - Method in interface org.apache.beam.runners.jobsubmission.JobServerDriver.JobInvokerFactory
- create() - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Creates and returns a new SparkRunner with default options.
- create() - Static method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
-
Creates and returns a new SparkStructuredStreamingRunner with default options.
- create() - Static method in class org.apache.beam.sdk.fn.server.InProcessServerFactory
- create() - Static method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
- create() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
- create() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.StreamingInsertsMetricsImpl
- create() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Creates an instance of this rule using options provided by
TestPipeline.testingPipelineOptions()
. - create() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
Creates an instance of this rule.
- create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- create() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcWriteResult
- create() - Static method in class org.apache.beam.sdk.io.jms.RetryConfiguration
- create() - Static method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
- create() - Static method in class org.apache.beam.sdk.io.mongodb.AggregationQuery
- create() - Static method in class org.apache.beam.sdk.io.mongodb.FindQuery
- create() - Static method in class org.apache.beam.sdk.io.mongodb.UpdateConfiguration
- create() - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- create() - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
- create() - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- create() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
- create() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory
- create() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
- create() - Method in interface org.apache.beam.sdk.io.solace.broker.SempClientFactory
-
This method is the core of the factory interface.
- create() - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
-
This is the core method that subclasses must implement.
- create() - Static method in class org.apache.beam.sdk.io.solace.RetryCallableManager
-
Creates a new
RetryCallableManager
with default retry settings. - create() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
-
Creates a
SplunkEvent
object. - create() - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError.Builder
-
Builds a
SplunkWriteError
object. - create() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
-
Creates and returns an object that implements
PipelineOptions
using the values configured on this builder during construction. - create() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Creates and returns an object that implements
PipelineOptions
. - create() - Static method in class org.apache.beam.sdk.Pipeline
-
Constructs a pipeline from default
PipelineOptions
. - create() - Static method in class org.apache.beam.sdk.PipelineRunner
-
Creates a runner from the default app
PipelineOptions
. - create() - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return an empty
FieldAccessDescriptor
. - create() - Static method in class org.apache.beam.sdk.schemas.transforms.AddFields
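The PipelineOptionsFactory.create and Pipeline.create entries above combine into the usual bootstrap sketch (args comes from main):

    PipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline pipeline = Pipeline.create(options);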
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.Filter
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Returns a transform that does a global combine using an aggregation built up by calls to aggregateField and aggregateFields.
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.RenameFields
-
Create an instance of this transform.
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.Select
- create() - Static method in class org.apache.beam.sdk.testing.TestPipeline
-
Creates and returns a new test pipeline.
- create() - Static method in class org.apache.beam.sdk.transforms.Distinct
-
Returns a
Distinct<T>
PTransform
. - create() - Static method in class org.apache.beam.sdk.transforms.DoFnSchemaInformation
-
Create an instance.
- create() - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns a
GroupByKey<K, V>
PTransform
. - create() - Static method in class org.apache.beam.sdk.transforms.Impulse
-
Create a new
Impulse
PTransform
. - create() - Static method in class org.apache.beam.sdk.transforms.join.CoGroupByKey
-
Returns a
CoGroupByKey<K>
PTransform
. - create() - Static method in class org.apache.beam.sdk.transforms.Keys
-
Returns a
Keys<K>
PTransform
. - create() - Static method in class org.apache.beam.sdk.transforms.KvSwap
-
Returns a
KvSwap<K, V>
PTransform
. - create() - Static method in class org.apache.beam.sdk.transforms.PeriodicImpulse
- create() - Static method in class org.apache.beam.sdk.transforms.PeriodicSequence
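For GroupByKey.create() above, a brief sketch (the keyed input is assumed):

    PCollection<KV<String, Integer>> scores = ...; // assumed keyed input
    // Collects all values sharing a key into a single Iterable per key.
    PCollection<KV<String, Iterable<Integer>>> grouped =
        scores.apply(GroupByKey.create());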
- create() - Static method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
-
Creates a
ResourceHints
instance with no hints. - create() - Static method in class org.apache.beam.sdk.transforms.Values
-
Returns a
Values<V>
PTransform
. - create(boolean) - Static method in class org.apache.beam.runners.flink.TestFlinkRunner
- create(boolean, Map<TupleTag<?>, KV<WindowingStrategy<?, ?>, SideInputBroadcast<?>>>) - Static method in class org.apache.beam.runners.spark.util.SideInputReaderFactory
-
Creates and returns a
SideInputReader
based on the configuration. - create(byte[], SparkPCollectionView.Type, Coder<Iterable<WindowedValue<?>>>) - Static method in class org.apache.beam.runners.spark.translation.SideInputMetadata
-
Creates a new instance of SideInputMetadata.
- create(byte[], SparkPCollectionView.Type, Coder<T>) - Static method in class org.apache.beam.runners.spark.util.SideInputBroadcast
- create(double) - Static method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
Returns
TDigestQuantiles.TDigestQuantilesFn
combiner with the given compression factor. - create(double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
-
Creates an instance with rate=0 and window=rowCount for bounded sources.
- create(double, double, double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
- create(int) - Static method in class org.apache.beam.sdk.io.jms.RetryConfiguration
- create(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Like
ApproximateQuantiles.ApproximateQuantilesCombineFn.create(int, Comparator)
, but sorts values using their natural ordering. - create(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Returns an approximate quantiles combiner with the given
compareFn
and desired number of quantiles. - create(int, ComparatorT, long, double) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Creates an approximate quantiles combiner with the given
compareFn
and desired number of quantiles. - create(int, Duration, Duration) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.RetryConfiguration
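The ApproximateQuantilesCombineFn factories above are usually reached through the ApproximateQuantiles transform; a sketch (input assumed):

    PCollection<Integer> values = ...; // assumed input
    // Five quantiles: approximately min, 25th percentile, median, 75th percentile, max.
    PCollection<List<Integer>> quartiles =
        values.apply(ApproximateQuantiles.globally(5));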
- create(int, Duration, Duration) - Static method in class org.apache.beam.sdk.io.jms.RetryConfiguration
- create(int, Duration) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.RetryConfiguration
-
Creates RetryConfiguration for
ElasticsearchIO
with the provided maxAttempts and maxDuration, using exponential-backoff-based retries. - create(int, Duration) - Static method in class org.apache.beam.sdk.io.solr.SolrIO.RetryConfiguration
- create(long) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
- create(long, long, long, long) - Static method in class org.apache.beam.sdk.metrics.DistributionResult
- create(long, long, SerializableFunction<InputT, Long>, Duration) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- create(long, Instant) - Static method in class org.apache.beam.sdk.metrics.GaugeResult
- create(BuilderT, ClientConfiguration, AwsOptions) - Method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
Configure a client builder
ClientBuilderFactory
using the providedClientConfiguration
and fall back to the global defaults inAwsOptions
where necessary. - create(BuilderT, ClientConfiguration, AwsOptions) - Method in class org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory.DefaultClientBuilder
- create(BuilderT, AwsOptions) - Method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
Configure a client builder
ClientBuilderFactory
using the global defaults inAwsOptions
. - create(StorageObject) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
- create(BatchTransactionId) - Static method in class org.apache.beam.sdk.io.gcp.spanner.Transaction
- create(Mutation, Mutation...) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
-
Creates a new group.
- create(Mutation, Iterable<Mutation>) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- create(JCSMPProperties, Queue) - Static method in class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- create(EventT, Exception) - Static method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
-
Creates a new unprocessed event that failed due to a thrown exception.
- create(EventT, UnprocessedEvent.Reason) - Static method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
-
Create new unprocessed event.
- create(IOException) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
- create(String, String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
- create(String, MetricName) - Static method in class org.apache.beam.sdk.metrics.MetricKey
- create(Class<?>, String, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ZetaSqlScalarFunctionImpl
-
Creates
Function
from given class. - create(Class<?>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
- create(Iterable<MetricResult<Long>>, Iterable<MetricResult<DistributionResult>>, Iterable<MetricResult<GaugeResult>>, Iterable<MetricResult<StringSetResult>>, Iterable<MetricResult<BoundedTrieResult>>, Iterable<MetricResult<HistogramData>>) - Static method in class org.apache.beam.sdk.metrics.MetricQueryResults
- create(Long, long, Long, Long, long, long, long, boolean, ContiguousSequenceRange) - Static method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- create(Object...) - Method in interface org.apache.beam.sdk.schemas.SchemaUserTypeCreator
- create(Method) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
Creates
Function
from given method. - create(Method, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
Creates
org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function
from given method.
- create(Method, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.JavaScalarFunction
- create(Method, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ZetaSqlScalarFunctionImpl
-
Creates
Function
from given method. - create(String) - Method in interface org.apache.beam.runners.fnexecution.control.OutputReceiverFactory
-
Get a new
FnDataReceiver
for an output PCollection. - create(String) - Static method in class org.apache.beam.sdk.fn.channel.AddHarnessIdInterceptor
- create(String) - Static method in interface org.apache.beam.sdk.io.aws2.auth.WebIdTokenProvider
-
Factory method for OIDC web identity token provider implementations.
- create(String) - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
- create(String) - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
- create(String) - Static method in class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
-
Creates a new Solr connection configuration.
- create(String) - Static method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.ParsedMetricName
- create(String[]) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
Creates a new Elasticsearch connection configuration with no default index or type.
- create(String...) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
Create an enumeration type from a fixed set of String values; integer values will be automatically chosen.
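A short sketch of EnumerationType.create (values are illustrative):

    EnumerationType color = EnumerationType.create("RED", "GREEN", "BLUE");
    EnumerationType.Value green = color.valueOf("GREEN"); // backed by an auto-assigned integer
    int ordinal = green.getValue();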
- create(String[], String) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
Creates a new Elasticsearch connection configuration with no default type.
- create(String[], String, String) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
Creates a new Elasticsearch connection configuration.
- create(String, int) - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
- create(String, String) - Method in class org.apache.beam.runners.fnexecution.control.TimerReceiverFactory
- create(String, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
-
Create a PTransform instance.
- create(String, String) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
- create(String, String) - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
-
Describe a connection configuration to the MQTT broker.
- create(String, String, String) - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- create(String, String, String, long, long) - Static method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
- create(String, String, String, Boolean) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- create(String, String, String, Struct) - Static method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
- create(String, String, Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.sdk.fn.data.TimerEndpoint
- create(String, Map<String, String>) - Static method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.ParsedMetricName
- create(String, Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.sdk.fn.data.DataEndpoint
- create(String, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations
- create(String, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
- create(String, ByteString, OutputStream) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
- create(WritableByteChannel) - Method in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- create(WritableByteChannel) - Method in interface org.apache.beam.sdk.io.FileBasedSink.WritableByteChannelFactory
- create(List<? extends FnService>, Endpoints.ApiServiceDescriptor, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Create
GrpcFnServer
s for the providedFnService
s running on a specified port. - create(List<String>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Full table name with path.
- create(List<String>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
Create an enumeration type from a fixed set of String values; integer values will be automatically chosen.
- create(List<String>, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Table name plus the path up to, but not including, the table name.
- create(List<String>, String) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast.CompatibilityError
- create(List<String>, Map<String, List<Dependency>>) - Static method in class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
- create(List<String>, Optional<Schema.TypeName>) - Static method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
- create(List<Schema.Field>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create an
OneOfType
logical type. - create(List<Schema.Field>, Map<String, Integer>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create an
OneOfType
logical type. - create(List<BindableService>, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.server.InProcessServerFactory
- create(List<BindableService>, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Creates an instance of this server at the address specified by the given service descriptor and bound to multiple services.
- create(List<BindableService>, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.server.ServerFactory.InetSocketAddressServerFactory
- create(Map<String, Integer>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
Create an enumeration type over a set of String->Integer values.
- create(Map<String, String>) - Static method in class org.apache.beam.sdk.io.hcatalog.HCatalogBeamSchema
-
Create the schema adapter.
- create(Map<String, Broadcast<SideInputValues<?>>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
-
Creates a
SideInputReader
for Spark from a map of PCollectionView tag ids
and the corresponding broadcasted SideInputValues
. - create(Set<String>) - Static method in class org.apache.beam.sdk.metrics.StringSetResult
-
Creates a
StringSetResult
from the givenSet
by making an immutable copy. - create(Set<List<String>>) - Static method in class org.apache.beam.sdk.metrics.BoundedTrieResult
-
Creates a
BoundedTrieResult
from the givenSet
by making an immutable copy. - create(DataSource) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
- create(DataSource) - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Creates
SnowflakeIO.DataSourceConfiguration
from an existing instance of
. - create(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.TypeWithNullability
- create(ProvisionApi.ProvisionInfo, HeaderAccessor) - Static method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
- create(Endpoints.ApiServiceDescriptor, HeaderAccessor) - Static method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
-
Creates a new instance of
BeamWorkerStatusGrpcService
. - create(DoFnRunner<InputT, OutputT>, String, Coder, Coder, OperatorStateBackend, KeyedStateBackend<Object>, int, SerializablePipelineOptions) - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
- create(DoFnRunner<InputT, OutputT>, String, Coder, Coder, OperatorStateBackend, KeyedStateBackend<Object>, int, SerializablePipelineOptions, Supplier<Locker>, Function<InputT, Object>, Runnable) - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
- create(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowClient
- create(FlinkJobServerDriver.FlinkServerConfiguration) - Static method in class org.apache.beam.runners.flink.FlinkJobInvoker
- create(ReferenceCountingExecutableStageContextFactory.Creator, SerializableFunction<Object, Boolean>) - Static method in class org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory
- create(EnvironmentFactory, GrpcFnServer<GrpcDataService>, GrpcFnServer<GrpcStateService>, IdGenerator) - Static method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
-
Deprecated.
- create(ProcessManager, RunnerApi.Environment, String, InstructionRequestHandler) - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
- create(ProcessManager, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool.Source, IdGenerator, PipelineOptions) - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory
- create(JobInfo) - Static method in class org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext
- create(JobInfo) - Static method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
- create(JobInfo, Map<String, EnvironmentFactory.Provider>) - Static method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
- create(SparkJobServerDriver.SparkServerConfiguration) - Static method in class org.apache.beam.runners.spark.SparkJobInvoker
- create(SparkPipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Creates and returns a new SparkRunner with specified options.
- create(SparkStructuredStreamingPipelineOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
-
Creates and returns a new SparkStructuredStreamingRunner with specified options.
- create(SparkCombineFn<InputT, ValueT, AccumT, ?>, Function<InputT, ValueT>, WindowingStrategy<?, ?>, Comparator<BoundedWindow>) - Static method in interface org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator
-
Creates a concrete accumulator for the given type.
- create(Coder<InputT>) - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
Returns an
ApproximateDistinct.ApproximateDistinctFn
combiner with the given input coder. - create(Coder<InputT>) - Static method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
Returns a
SketchFrequencies.CountMinSketchFn
combiner with the given input coder. - create(Coder<T>) - Static method in class org.apache.beam.sdk.testing.TestStream
-
Create a new
TestStream.Builder
with no elements and watermark equal toBoundedWindow.TIMESTAMP_MIN_VALUE
. - create(Coder<T>, Coder<MetaT>) - Static method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
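For TestStream.create(Coder) above, a sketch of building a deterministic unbounded input for tests (a TestPipeline p is assumed):

    TestStream<String> events =
        TestStream.create(StringUtf8Coder.of())
            .addElements("a", "b")
            .advanceWatermarkTo(new Instant(100L)) // Joda-Time Instant
            .addElements("c")
            .advanceWatermarkToInfinity();
    PCollection<String> input = p.apply(events);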
- create(Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.sdk.fn.data.DecodingFnDataReceiver
- create(ExpansionService, String, int) - Static method in class org.apache.beam.sdk.expansion.service.ExpansionServer
-
Creates an
ExpansionServer
for the provided ExpansionService running on an arbitrary port. - create(GcsPath, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Deprecated. Use
GcsUtil.create(GcsPath, CreateOptions)
instead. - create(GcsPath, String, Integer) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Deprecated. Use
GcsUtil.create(GcsPath, CreateOptions)
instead. - create(GcsPath, GcsUtil.CreateOptions) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Creates an object in GCS and prepares for uploading its contents.
- create(OrderedProcessingHandler<EventTypeT, EventKeyTypeT, StateTypeT, ResultTypeT>) - Static method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessor
-
Create the transform.
- create(BufferedExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
- create(BufferedExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.SortValues
-
Returns a
SortValues<PrimaryKeyT, SecondaryKeyT, ValueT>
PTransform
. - create(ExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter
-
Returns a
Sorter
configured with the givenExternalSorter.Options
. - create(DataCatalogPipelineOptions) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- create(GrpcFnServer<ArtifactStagingService>, Function<String, String>, ThrowingConsumer<Exception, String>, JobInvoker) - Static method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
Creates an InMemoryJobService.
- create(GrpcFnServer<ArtifactStagingService>, Function<String, String>, ThrowingConsumer<Exception, String>, JobInvoker, int) - Static method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
Creates an InMemoryJobService.
- create(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool.Source, IdGenerator) - Static method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory
- create(ClassLoaderFileSystem.ClassLoaderResourceId, CreateOptions) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
- create(EmptyMatchTreatment) - Static method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
Creates a
FileIO.MatchConfiguration
with the givenEmptyMatchTreatment
. - create(MatchResult.Status, IOException) - Static method in class org.apache.beam.sdk.io.fs.MatchResult
- create(MatchResult.Status, List<MatchResult.Metadata>) - Static method in class org.apache.beam.sdk.io.fs.MatchResult
- create(ResourceId, String) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Returns a write channel for the given
ResourceId
. - create(ResourceId, CreateOptions) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Returns a write channel for the given
ResourceId
withCreateOptions
. - create(SubscriptionPartition) - Method in interface org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactory
- create(SubscriptionPartition) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactoryImpl
- create(SpannerConfig, String, String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.MetadataSpannerConfigFactory
-
Generates a SpannerConfig that can be used to access the change stream metadata database by copying only the necessary fields from the given primary database SpannerConfig and setting the instance ID and database ID to the supplied metadata values.
- create(MetricKey, T, T) - Static method in class org.apache.beam.sdk.metrics.MetricResult
- create(MetricKey, Boolean, T) - Static method in class org.apache.beam.sdk.metrics.MetricResult
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.StagerFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.UnboundedReaderMaxReadTimeFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.StagingLocationFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.EnableWindmillServiceDirectPathFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.GlobalConfigRefreshPeriodFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.HarnessUpdateReportingPeriodFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.LocalWindmillHostportFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.MaxStackTraceDepthToReportFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.PeriodicStatusPageDirectoryFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.WindmillServiceStreamingRpcBatchLimitFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DefaultGcpRegionFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.flink.FlinkPipelineOptions.MaxBundleSizeFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.flink.FlinkPipelineOptions.MaxBundleTimeFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.portability.testing.TestPortablePipelineOptions.DefaultJobServerConfigFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkCommonPipelineOptions.StorageLevelFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkCommonPipelineOptions.TmpCheckpointDirFactory
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkContextOptions.EmptyListenersList
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceOptions.ExpansionServiceConfigFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceOptions.JavaClassLookupAllowListFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.DefaultProjectFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.EnableStreamingEngineFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpOAuthScopesFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpTempLocationFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpUserCredentialsFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.ExecutorServiceFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.PathValidatorFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsReadOptionsFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsUtilFactory
-
Returns an instance of GcsUtil based on the PipelineOptions.
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions.MapFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsRegionFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsUserCredentialsFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.S3Options.S3UploadBufferSizeBytesFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.S3Options.SSECustomerKeyFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosOptions.CosmosClientBuilderFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.options.AzureOptions.AzureUserCredentialsFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsOptions.GoogleAdsCredentialsFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions.ConfigurationLocator
- create(PipelineOptions) - Method in class org.apache.beam.sdk.metrics.MetricsOptions.NoOpMetricsSink
- create(PipelineOptions) - Method in interface org.apache.beam.sdk.options.DefaultValueFactory
-
Creates a default value for a getter marked with Default.InstanceFactory.
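The DefaultValueFactory entry above is an extension point: an implementation named in @Default.InstanceFactory computes an options getter's default at runtime. A minimal sketch, assuming a hypothetical MyOptions interface and JobPrefixFactory:

import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.DefaultValueFactory;
import org.apache.beam.sdk.options.PipelineOptions;

// Hypothetical options interface; the getter's default is computed by the
// factory named in @Default.InstanceFactory when no value was set.
public interface MyOptions extends PipelineOptions {
  @Default.InstanceFactory(JobPrefixFactory.class)
  String getJobPrefix();

  void setJobPrefix(String value);

  // Invoked once when the getter is first read and no value was supplied.
  class JobPrefixFactory implements DefaultValueFactory<String> {
    @Override
    public String create(PipelineOptions options) {
      return "job-" + options.getJobName();
    }
  }
}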
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.ExecutorOptions.ScheduledExecutorServiceFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.AtomicLongFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.DirectRunner
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.JobNameFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.UserAgentFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.BundleProcessorCacheTimeoutFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMbFactory
- create(PipelineOptions) - Static method in class org.apache.beam.sdk.Pipeline
-
Constructs a pipeline from the provided PipelineOptions.
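A minimal sketch of the Pipeline.create(PipelineOptions) entry above; the class name is illustrative:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class PipelineCreateExample {
  public static void main(String[] args) {
    // Parse command-line flags into PipelineOptions, then build the pipeline.
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    Pipeline pipeline = Pipeline.create(options);
    // ... apply transforms here ...
    pipeline.run().waitUntilFinish();
  }
}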
- create(PipelineOptions) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcherFactory
- create(PipelineOptions) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions.EmptyListDefault
- create(PipelineOptions, Storage, HttpRequestInitializer, ExecutorService, Credentials, Integer, GcsUtil.GcsCountersOptions, GoogleCloudStorageReadOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsUtilFactory
-
Returns an instance of GcsUtil based on the given parameters.
- create(PipelineOptions, ExecutorService, OutboundObserverFactory) - Static method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
- create(PipelineOptions, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<FnApiControlClientPoolService>, ControlClientPool.Source) - Static method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory
- create(ValueProvider<TableReference>, DataFormat, ValueProvider<List<String>>, ValueProvider<String>, SerializableFunction<SchemaAndRecord, T>, Coder<T>, BigQueryServices, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- create(ValueProvider<TableReference>, ValueProvider<List<String>>, ValueProvider<String>, SerializableFunction<SchemaAndRecord, T>, Coder<T>, BigQueryServices) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- create(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
- create(ValueProvider<String>, ValueProvider<Integer>) - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
- create(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.DataSourceConfiguration
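As a usage note for the JdbcIO.DataSourceConfiguration.create entry above: the plain-String overload follows the same shape as the ValueProvider-based one. A hedged sketch with placeholder driver, URL, and credentials:

import org.apache.beam.sdk.io.jdbc.JdbcIO;

public class JdbcConfigExample {
  public static void main(String[] args) {
    // Driver class, JDBC URL, and credentials are placeholders.
    JdbcIO.DataSourceConfiguration config =
        JdbcIO.DataSourceConfiguration.create(
                "org.postgresql.Driver", "jdbc:postgresql://localhost:5432/mydb")
            .withUsername("user")
            .withPassword("secret");
  }
}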
- create(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
-
Creates an instance of this rule.
- create(Schema) - Static method in class org.apache.beam.sdk.testing.TestStream
- create(Schema.Field...) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create a OneOfType logical type.
- create(Schema, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.EntityToRow
-
Create a PTransform instance.
- create(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Static method in class org.apache.beam.sdk.testing.TestStream
- create(JsonToRow.JsonToRowWithErrFn) - Static method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError
- create(PCollectionView<?>, Coder<T>) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
- create(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.CachingFactory
- create(TypeDescriptor<?>, Schema) - Method in interface org.apache.beam.sdk.schemas.Factory
- create(TypeDescriptor<T>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.TypeDescriptorWithSchema
- create(RelTraitSet, RelNode, boolean) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
-
Creates an Uncollect.
- create(StreamObserver<ReqT>, Runnable) - Static method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
- create(StreamObserver<ReqT>, Runnable, Runnable) - Static method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
- create(Output<StreamRecord<WindowedValue<OutputT>>>, Lock, OperatorStateBackend) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.MultiOutputOutputManagerFactory
- create(Function<InputT, ValueT>, SparkCombineFn.WindowedAccumulator.Type, Iterable<WindowedValue<AccumT>>, Comparator<BoundedWindow>) - Static method in interface org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator
-
Create a concrete accumulator for the given type.
- create(Function<InputT, ValueT>, SparkCombineFn.WindowedAccumulator.Type, Comparator<BoundedWindow>) - Static method in interface org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator
- create(ResourceIdT, CreateOptions) - Method in class org.apache.beam.sdk.io.FileSystem
-
Returns a write channel for the given FileSystem.
- create(ServiceT, Endpoints.ApiServiceDescriptor) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Deprecated. This create function is used for Dataflow migration purposes only.
- create(ServiceT, Endpoints.ApiServiceDescriptor, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Create a GrpcFnServer for the provided FnService which will run at the endpoint specified in the Endpoints.ApiServiceDescriptor.
- create(AwsCredentialsProvider, Region, URI) - Static method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
- create(T) - Static method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
- Create<T> - Class in org.apache.beam.sdk.transforms
-
Create<T> takes a collection of elements of type T known when the pipeline is constructed and returns a PCollection<T> containing the elements.
- Create() - Constructor for class org.apache.beam.sdk.transforms.Create
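A short sketch of the Create<T> transform described above; the element values are illustrative:

import java.util.Arrays;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;

public class CreateExample {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.create());
    // Materialize an in-memory list as a PCollection.
    PCollection<String> words =
        pipeline.apply(Create.of(Arrays.asList("a", "b", "c")));
    pipeline.run().waitUntilFinish();
  }
}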
- CREATE_IF_NEEDED - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
-
Specifies that tables should be created if needed.
- CREATE_IF_NEEDED - Enum constant in enum class org.apache.beam.sdk.io.snowflake.enums.CreateDisposition
- CREATE_NEVER - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
-
Specifies that tables should not be created.
- CREATE_NEVER - Enum constant in enum class org.apache.beam.sdk.io.snowflake.enums.CreateDisposition
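A hedged sketch of how BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED is typically supplied to a BigQuery write; the rows, table spec, and schema are placeholders supplied by the caller:

import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.values.PCollection;

public class WriteWithCreateDisposition {
  // rows and schema are assumed to exist; the table spec is a placeholder.
  static void write(PCollection<TableRow> rows, TableSchema schema) {
    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")
            .withSchema(schema)
            // Create the table on first write if it does not exist yet.
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));
  }
}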
- CREATE_STREAMING_SPARK_VIEW_URN - Static variable in class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView
- CREATE_TIME - Enum constant in enum class org.apache.beam.sdk.io.kafka.KafkaTimestampType
- Create.OfValueProvider<T> - Class in org.apache.beam.sdk.transforms
- Create.TimestampedValues<T> - Class in org.apache.beam.sdk.transforms
-
A PTransform that creates a PCollection whose elements have associated timestamps.
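A small sketch of Create.TimestampedValues via the Create.timestamped factory; the elements and timestamps are illustrative:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TimestampedValue;
import org.joda.time.Instant;

public class CreateTimestampedExample {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.create());
    // Each element carries an explicit event timestamp.
    PCollection<String> events =
        pipeline.apply(
            Create.timestamped(
                TimestampedValue.of("open", new Instant(0L)),
                TimestampedValue.of("close", new Instant(1000L))));
    pipeline.run().waitUntilFinish();
  }
}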
- Create.Values<T> - Class in org.apache.beam.sdk.transforms
-
A PTransform that creates a PCollection from a set of in-memory objects.
- Create.WindowedValues<T> - Class in org.apache.beam.sdk.transforms
-
A PTransform that creates a PCollection whose elements have associated windowing metadata.
- createAccumulator() - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- createAccumulator() - Method in interface org.apache.beam.sdk.extensions.sql.udf.AggregateFn
-
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
- createAccumulator() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
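A minimal sketch of a Combine.CombineFn subclass showing the createAccumulator contract described above; AverageFn is hypothetical, and the accumulator relies on Java serialization for encoding:

import org.apache.beam.sdk.transforms.Combine;

// Averages doubles. createAccumulator() supplies the identity accumulator
// representing the accumulation of zero input values.
public class AverageFn extends Combine.CombineFn<Double, AverageFn.Accum, Double> {
  public static class Accum implements java.io.Serializable {
    double sum = 0;
    long count = 0;
  }

  @Override
  public Accum createAccumulator() {
    return new Accum();
  }

  @Override
  public Accum addInput(Accum acc, Double input) {
    acc.sum += input;
    acc.count++;
    return acc;
  }

  @Override
  public Accum mergeAccumulators(Iterable<Accum> accums) {
    Accum merged = createAccumulator();
    for (Accum acc : accums) {
      merged.sum += acc.sum;
      merged.count += acc.count;
    }
    return merged;
  }

  @Override
  public Double extractOutput(Accum acc) {
    return acc.count == 0 ? 0.0 : acc.sum / acc.count;
  }
}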
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
- createAccumulator(CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- createAccumulator(CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
- createAll(Class<?>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
Creates a Function for each method in a given class.
- createAndTrackNextReader() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- createArrayOf(String, Object[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createArtifactServerFactory(JobServerDriver.ServerConfiguration) - Static method in class org.apache.beam.runners.jobsubmission.JobServerDriver
- createBacklogGauge(MetricName) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
-
Creates a Gauge metric to record per partition backlog with the name
- createBatch(Class<?>, Class<?>, Class<?>) - Static method in class org.apache.beam.sdk.io.cdap.Plugin
-
Creates a batch plugin instance.
- createBatchExecutionEnvironment(FlinkPipelineOptions) - Static method in class org.apache.beam.runners.flink.FlinkExecutionEnvironments
-
If the submitted job is a batch processing job, this method creates the adequate Flink ExecutionEnvironment depending on the user-specified options.
- createBigQueryClientCustomErrors() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
- createBitXOr(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations
- createBlob() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createBlockGenerator(BlockGeneratorListener) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
- createBounded() - Static method in interface org.apache.beam.runners.spark.translation.SparkInputDataProcessor
-
Creates a SparkInputDataProcessor which processes input elements in a separate thread and observes produced outputs via a bounded queue in another thread.
- createBoundedTableStatistics(Double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- createBucket(String, Bucket) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Creates a Bucket under the specified project in Cloud Storage or propagates an exception.
- createBuilder(S3Options) - Method in interface org.apache.beam.sdk.io.aws2.options.S3ClientBuilderFactory
- createBuilder(S3Options) - Method in class org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
- createBuilder(BlobstoreOptions) - Method in class org.apache.beam.sdk.io.azure.blobstore.DefaultBlobstoreClientBuilderFactory
- createBuilder(BlobstoreOptions) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreClientBuilderFactory
- createBytesXMLMessage(Solace.Record, boolean, DeliveryMode) - Static method in class org.apache.beam.sdk.io.solace.broker.MessageProducerUtils
-
Create a BytesXMLMessage to be published in Solace.
- createCatalog(String, String, Map<String, String>) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.CatalogManager
-
Creates and stores a catalog of a particular type.
- createCatalog(String, String, Map<String, String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.EmptyCatalogManager
- createCatalog(String, String, Map<String, String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogManager
- createCatalogItems() - Static method in class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
- createClassLoader(List<String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.JavaUdfLoader
- createClob() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createCombineFn(AggregateCall, Schema.Field, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
-
Creates either a UDAF or a built-in Combine.CombineFn.
- createCombineFnAnalyticsFunctions(AggregateCall, Schema.Field, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
-
Creates either a UDAF or a built-in Combine.CombineFn for Analytic Functions.
- createComparator(boolean, ExecutionConfig) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- createComparator(boolean, ExecutionConfig) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- createConstantCombineFn() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
- createConstructorCreator(Class<? super T>, Constructor<T>, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- createConstructorCreator(Class<T>, Constructor<T>, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- createContextual(DeserializationContext, BeanProperty) - Method in class org.apache.beam.sdk.options.ValueProvider.Deserializer
- CREATED - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
- createDatabase(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
-
Creates a database with this name.
- createDatabase(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- createDatabase(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergCatalog
- createDataCatalogClient(DataCatalogPipelineOptions) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- CreateDataflowView<ElemT, ViewT> - Class in org.apache.beam.runners.dataflow
-
A DataflowRunner marker class for creating a PCollectionView.
- createDataset(String, String, DatasetProperties) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- createDataset(String, String, String, String, Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Create a Dataset with the given location, description and default expiration time for tables in the dataset (if null, tables don't expire).
- createDataset(String, String, String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
-
Create a Dataset with the given location, description and default expiration time for tables in the dataset (if null, tables don't expire).
- createDataset(String, String, String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- createDataset(List<WindowedValue<T>>, Encoder<WindowedValue<T>>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- createDatasetFromRDD(SparkSession, BoundedSource<T>, Supplier<PipelineOptions>, Encoder<WindowedValue<T>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.io.BoundedDatasetFactory
- createDatasetFromRows(SparkSession, BoundedSource<T>, Supplier<PipelineOptions>, Encoder<WindowedValue<T>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.io.BoundedDatasetFactory
- createDecompressingChannel(ReadableByteChannel) - Method in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- createDecompressingChannel(ReadableByteChannel) - Method in interface org.apache.beam.sdk.io.CompressedSource.DecompressingChannelFactory
-
Given a channel, create a channel that decompresses the content read from the channel.
- createDefault() - Static method in class org.apache.beam.sdk.coders.CoderRegistry
-
Creates a CoderRegistry containing registrations for all standard coders that are part of the core Java Apache Beam SDK, as well as any registrations provided by coder registrars.
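A short sketch of CoderRegistry.createDefault described above:

import org.apache.beam.sdk.coders.CannotProvideCoderException;
import org.apache.beam.sdk.coders.Coder;
import org.apache.beam.sdk.coders.CoderRegistry;

public class CoderRegistryExample {
  public static void main(String[] args) throws CannotProvideCoderException {
    // Registry pre-populated with the SDK's standard coders.
    CoderRegistry registry = CoderRegistry.createDefault();
    Coder<String> coder = registry.getCoder(String.class);
    System.out.println(coder); // typically a StringUtf8Coder
  }
}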
- createDefault() - Static method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
-
Creates a ManagedChannel relying on the ManagedChannelBuilder to choose the channel type.
- createDefault() - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Create a default ServerFactory.InetSocketAddressServerFactory.
- createDefault() - Static method in interface org.apache.beam.sdk.fn.server.ServerFactory.UrlFactory
- createDefault() - Static method in class org.apache.beam.sdk.schemas.SchemaRegistry
- createDefault() - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- createDicomStore(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Create a DicomStore.
- createDicomStore(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createDicomStore(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createDicomStore(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Create a DicomStore with a PubSub listener.
- CreateDisposition - Enum Class in org.apache.beam.sdk.io.snowflake.enums
-
Enum containing all supported dispositions for a table.
- createEnumerator(SplitEnumeratorContext<FlinkSourceSplit<T>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- createEnumerator(SplitEnumeratorContext<FlinkSourceSplit<T>>, boolean) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory
-
Creates a new, active RemoteEnvironment backed by a local Docker container.
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory
- createEnvironment(RunnerApi.Environment, String) - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory
-
Creates an active RunnerApi.Environment and returns a handle to it.
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory
-
Creates a new, active RemoteEnvironment backed by an unmanaged worker.
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory
-
Creates a new, active RemoteEnvironment backed by a forked process.
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironmentFactory
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.Provider
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.Provider
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory.Provider
-
Creates an EnvironmentFactory for the provided GrpcServices.
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.Provider
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory.Provider
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironmentFactory.Provider
- createEpoll() - Static method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
-
Creates a ManagedChannelFactory backed by an EpollDomainSocketChannel if the address is a DomainSocketAddress.
- createEpollDomainSocket() - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Create a ServerFactory.EpollDomainSocket.
- createEpollSocket() - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Create a ServerFactory.EpollSocket.
- createFactory() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
- createFactoryForCreateSubscription() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createFactoryForGetSchema(PubsubClient.TopicPath, PubsubClient.SchemaPath, Schema) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createFactoryForPublish(PubsubClient.TopicPath, Iterable<PubsubClient.OutgoingMessage>, Iterable<PubsubClient.OutgoingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
Return a factory for testing publishers.
- createFactoryForPull(Clock, PubsubClient.SubscriptionPath, int, Iterable<PubsubClient.IncomingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
Return a factory for testing subscribers.
- createFactoryForPullAndPublish(PubsubClient.SubscriptionPath, PubsubClient.TopicPath, Clock, int, Iterable<PubsubClient.IncomingMessage>, Iterable<PubsubClient.OutgoingMessage>, Iterable<PubsubClient.OutgoingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
Returns a factory for a test that is expected to both publish and pull messages over the course of the test.
- createFhirStore(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Create FHIR Store.
- createFhirStore(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createFhirStore(String, String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createFhirStore(String, String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Create FHIR Store with a PubSub topic listener.
- createFile() - Method in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
-
Generates a random file with NUM_LINES lines, each between 60 and 120 characters.
- createForSubrangeOfFile(String, long, long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
-
Deprecated. Used by Dataflow worker.
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.BlockBasedSource
-
Creates a BlockBasedSource for the specified range in a single file.
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.CompressedSource
-
Creates a CompressedSource for a subrange of a file.
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.FileBasedSource
-
Creates and returns a new FileBasedSource of the same type as the current FileBasedSource backed by a given file and an offset range.
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.TextSource
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.xml.XmlSource
- createFrom(String) - Static method in class org.apache.beam.sdk.fn.channel.SocketAddressFactory
-
Parse a SocketAddress from the given string.
- createGetter(FieldValueTypeInformation, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- createGetterConversions(StackManipulation) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
- createGetterConversions(StackManipulation) - Method in interface org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversionsFactory
- createHL7v2Message(String, Message) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Creates an HL7v2 message.
- createHL7v2Message(String, Message) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createHL7v2Store(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Creates an HL7v2 store.
- createHL7v2Store(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createImplementor(Method) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
- createInProcess() - Static method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
-
Creates a ManagedChannel using an in-process channel.
- createInput(Pipeline, Map<String, PCollection<?>>) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceSchemaTransformProvider
- createInput(Pipeline, Map<String, PCollection<?>>) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
- createInputFormatInstance() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
-
Creates an instance of the InputFormat class.
- createInputSplits(int) - Method in class org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat
- createInputSplits(int) - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat
- createInstance() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- createInstance() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- createInstance() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- createInternal(WindowingStrategy) - Static method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
- createIterator() - Method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables.Default
- createJCSMPSendMultipleEntry(List<Solace.Record>, boolean, SerializableFunction<Solace.Record, Destination>, DeliveryMode) - Static method in class org.apache.beam.sdk.io.solace.broker.MessageProducerUtils
-
Create a JCSMPSendMultipleEntry array to be published in Solace.
- createJob(Job) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
Creates the Dataflow Job.
- createJobInvocation(String, String, ListeningExecutorService, RunnerApi.Pipeline, FlinkPipelineOptions, PortablePipelineRunner) - Method in class org.apache.beam.runners.flink.FlinkJobInvoker
- createJobServerFactory(JobServerDriver.ServerConfiguration) - Static method in class org.apache.beam.runners.jobsubmission.JobServerDriver
- createJobService() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
- createKafkaRead() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- createMessagesArray(Iterable<Solace.Record>, boolean) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- createMetadata(MetaT) - Static method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
- createMetadataTable() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
-
Create the metadata table if it does not exist yet.
- createNamespace(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- createNClob() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createNewDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Creates a new dataset.
- createNewDataset(String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Creates a new dataset with defaultTableExpirationMs.
- createNewDataset(String, String, Long, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Creates a new dataset with defaultTableExpirationMs and in a specified location (GCP region).
- createNewTable(String, String, Table) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- CreateOptions - Class in org.apache.beam.sdk.io.fs
-
An abstract class that contains common configuration options for creating resources.
- CreateOptions() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
- CreateOptions() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions
- CreateOptions.Builder<BuilderT> - Class in org.apache.beam.sdk.io.fs
-
An abstract builder for CreateOptions.
- CreateOptions.StandardCreateOptions - Class in org.apache.beam.sdk.io.fs
-
Standard configuration options with a builder.
- CreateOptions.StandardCreateOptions.Builder - Class in org.apache.beam.sdk.io.fs
-
Builder for CreateOptions.StandardCreateOptions.
- createOrUpdateReadChangeStreamMetadataTable(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
-
Utility method to create or update Read Change Stream metadata table.
- createOutboundAggregator(Supplier<String>, boolean) - Method in interface org.apache.beam.runners.fnexecution.data.FnDataService
-
Creates a BeamFnDataOutboundAggregator for buffering and sending outbound data and timers over the data plane.
- createOutboundAggregator(Supplier<String>, boolean) - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
- createOutputIterator(Iterator<WindowedValue<FnInputT>>, SparkProcessContext<K, FnInputT, FnOutputT>) - Method in interface org.apache.beam.runners.spark.translation.SparkInputDataProcessor
-
Creates a transformation which processes input partition data and returns output results as Iterator.
- createOutputMap(Iterable<String>) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
-
Creates a mapping from PCollection id to output tag integer.
- createPane(boolean, boolean, PaneInfo.Timing) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
- createPane(boolean, boolean, PaneInfo.Timing, long, long) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
Factory method to create a PaneInfo with the specified parameters.
- createPartitionMetadataTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Creates the metadata table in the given instance and database configuration, with the table name specified in the constructor.
- createPipeline(PipelineOptions) - Method in class org.apache.beam.sdk.expansion.service.ExpansionService
- createPipelineOptions(Map<String, String>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
- createPlanner(JdbcConnection, Collection<RuleSet>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.Factory
- createPrepareContext() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createPrimitiveOutputInternal(Pipeline, WindowingStrategy<?, ?>, PCollection.IsBounded, Coder<T>) - Static method in class org.apache.beam.sdk.values.PCollection
-
For internal use only; no backwards-compatibility guarantees.
- createPrimitiveOutputInternal(Pipeline, WindowingStrategy<?, ?>, PCollection.IsBounded, Coder<T>, TupleTag<?>) - Static method in class org.apache.beam.sdk.values.PCollection
-
For internal use only; no backwards-compatibility guarantees.
- createProcessContext(ValueInSingleWindow<InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- createProperties() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- createPushDownRel(RelDataType, List<String>, BeamSqlTableFilter) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- createQuery(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- createQuery(Expression, Class<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createQuery(Expression, Type) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createQueryUsingStandardSql(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- createQueueForTopic(String, String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClient
- createQueueForTopic(String, String) - Method in interface org.apache.beam.sdk.io.solace.broker.SempClient
-
This is only called when a user requests to read data from a topic.
- createRandomSubscription(PubsubClient.ProjectPath, PubsubClient.TopicPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create a random subscription for topic.
- createReader(FlinkSourceSplit<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSourceReader
- createReader(FlinkSourceSplit<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
-
Create a Source.Reader for the given FlinkSourceSplit.
- createReader(FlinkSourceSplit<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.BoundedCosmosBDSource
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
-
Returns a new BoundedSource.BoundedReader that reads from this source.
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
- createReader(PipelineOptions, CheckpointMarkT) - Method in class org.apache.beam.sdk.io.UnboundedSource
-
Create a new UnboundedSource.UnboundedReader to read from this source, resuming from the given checkpoint if present.
- createReader(PipelineOptions, CheckpointMarkImpl) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
- createReader(PipelineOptions, SolaceCheckpointMark) - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- createReader(SourceReaderContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSource
- createReader(SourceReaderContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSource
- createReadSession(CreateReadSessionRequest) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
-
Create a new read session against an existing table.
- createRPCLatencyHistogram(KafkaSinkMetrics.RpcMethod, String) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
-
Creates a Histogram metric to record RPC latency with the name
- createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create Schema from Schema definition content.
- createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Create Schema from Schema definition content.
- createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
Create Schema from Schema definition content.
- createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createSerializer(ExecutionConfig) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- createSerializer(ExecutionConfig) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- createSessionToken(String) - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
- createSetter(FieldValueTypeInformation, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- createSetterConversions(StackManipulation) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
- createSetterConversions(StackManipulation) - Method in interface org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversionsFactory
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.BlockBasedSource
-
Creates a BlockBasedReader.
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.CompressedSource
-
Creates a FileBasedReader to read a single file.
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
-
Creates and returns an instance of a FileBasedReader implementation for the current source assuming the source represents a single file.
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.TextSource
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.xml.XmlSource
- createSingleMessage(Solace.Record, boolean) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- createSource - Variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- createSourceForSubrange(long, long) - Method in class org.apache.beam.sdk.io.FileBasedSource
- createSourceForSubrange(long, long) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns an OffsetBasedSource for a subrange of the current source.
- createSQLXML() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createStateBackend(FlinkPipelineOptions) - Method in interface org.apache.beam.runners.flink.FlinkStateBackendFactory
- createStatement() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createStatement(int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createStatement(int, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createStateOnInitialEvent(EventT) - Method in interface org.apache.beam.sdk.extensions.ordered.EventExaminer
-
If the event was the first event for a given key, create the state to hold the data required for processing.
- createStaticCreator(Class<T>, Method, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- createStaticCreator(Class<T>, Method, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- CreateStream<T> - Class in org.apache.beam.runners.spark.io
-
Create an input stream from a Queue.
- createStreamExecutionEnvironment(FlinkPipelineOptions, List<String>, String) - Static method in class org.apache.beam.runners.flink.FlinkExecutionEnvironments
-
If the submitted job is a stream processing job, this method creates the adequate Flink StreamExecutionEnvironment depending on the user-specified options.
- createStreaming(Class<?>, SerializableFunction<V, Long>, Class<? extends Receiver<V>>) - Static method in class org.apache.beam.sdk.io.cdap.Plugin
-
Creates a streaming plugin instance with a default function for getting args for Receiver.
- createStreaming(Class<?>, SerializableFunction<V, Long>, Class<? extends Receiver<V>>, SerializableFunction<PluginConfig, Object[]>) - Static method in class org.apache.beam.sdk.io.cdap.Plugin
-
Creates a streaming plugin instance.
- CreateStreamingSparkView<ElemT, ViewT> - Class in org.apache.beam.runners.spark.translation.streaming
-
Spark streaming overrides for various view (side input) transforms.
- CreateStreamingSparkView(PCollectionView<ViewT>) - Constructor for class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView
- CreateStreamingSparkView.CreateSparkPCollectionView<ElemT, ViewT> - Class in org.apache.beam.runners.spark.translation.streaming
-
Creates a primitive PCollectionView.
- CreateStreamingSparkView.Factory<ElemT, ViewT> - Class in org.apache.beam.runners.spark.translation.streaming
- createStringAggOperator(ResolvedNodes.ResolvedFunctionCallBase) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- createStruct(String, Object[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- createStructuralValues(Coder<T>, List<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Testing utilities below depend on standard assertions and matchers to compare elements read by sources.
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create subscription to topic.
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createTable(Table) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Creates the specified table if it does not exist.
- createTable(Table) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
-
Creates the specified table if it does not exist.
- createTable(Table) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- createTable(String, Schema, List<String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergTableProvider
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
- createTable(Table) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Creates a table.
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
- CreateTableHelpers - Class in org.apache.beam.sdk.io.gcp.bigquery
- CreateTableHelpers() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers
- CreateTables<DestinationT, ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Creates any tables needed before performing streaming writes to the tables.
- CreateTables(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
-
The list of tables created so far, so we don't try the creation each time.
- createTest(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
- createTimestampPolicy(TopicPartition, Optional<Instant>) - Method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
-
Creates a TimestampPolicy for a partition.
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create topic.
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create a TopicPath with PubsubClient.SchemaPath.
- createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createTransaction() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
-
Returns a transform that creates a batch transaction.
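A hedged sketch of SpannerIO.createTransaction producing a PCollectionView<Transaction> that reads can share for consistency; the instance and database IDs are placeholders:

import com.google.cloud.spanner.TimestampBound;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.spanner.SpannerConfig;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
import org.apache.beam.sdk.io.gcp.spanner.Transaction;
import org.apache.beam.sdk.values.PCollectionView;

public class CreateTransactionExample {
  static PCollectionView<Transaction> createTx(Pipeline pipeline) {
    SpannerConfig config =
        SpannerConfig.create()
            .withInstanceId("my-instance")  // placeholder
            .withDatabaseId("my-database"); // placeholder
    // The view can be passed to subsequent Spanner reads for a
    // consistent snapshot across them.
    return pipeline.apply(
        SpannerIO.createTransaction()
            .withSpannerConfig(config)
            .withTimestampBound(TimestampBound.strong()));
  }
}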
- CreateTransaction() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
- createTranslationContext(JobInfo, FlinkPipelineOptions, String, List<String>) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
-
Creates a batch translation context.
- createTranslationContext(JobInfo, FlinkPipelineOptions, String, List<String>) - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator
- createTranslationContext(JobInfo, FlinkPipelineOptions, String, List<String>) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
-
Creates a streaming translation context.
- createTranslationContext(JobInfo, FlinkPipelineOptions, ExecutionEnvironment) - Static method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
- createTranslationContext(JobInfo, FlinkPipelineOptions, StreamExecutionEnvironment) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
-
Creates a streaming translation context.
- createTranslationContext(JavaSparkContext, SparkPipelineOptions, JobInfo) - Method in class org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator
- createTranslationContext(JavaSparkContext, SparkPipelineOptions, JobInfo) - Method in interface org.apache.beam.runners.spark.translation.SparkPortablePipelineTranslator
- createTranslationContext(JavaSparkContext, SparkPipelineOptions, JobInfo) - Method in class org.apache.beam.runners.spark.translation.SparkStreamingPortablePipelineTranslator
- createTranslator() - Static method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
-
Creates a batch translator.
- createTranslator(Map<String, FlinkBatchPortablePipelineTranslator.PTransformTranslator>) - Static method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
-
Creates a batch translator.
- createTypeConversion(boolean) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
- createTypeConversion(boolean) - Method in interface org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversionsFactory
- createUnbounded() - Static method in interface org.apache.beam.runners.spark.translation.SparkInputDataProcessor
-
Creates a SparkInputDataProcessor which does processing in the calling thread.
- createUnboundedTableStatistics(Double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- createUrl(String, int) - Method in interface org.apache.beam.sdk.fn.server.ServerFactory.UrlFactory
- createValue(int, T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create a OneOfType.Value specifying which field to set and the value to set.
- createValue(String, T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create a OneOfType.Value specifying which field to set and the value to set.
- createValue(EnumerationType.Value, T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create a OneOfType.Value specifying which field to set and the value to set.
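A small sketch combining the OneOfType.create and createValue entries indexed here; the field names are illustrative:

import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.schemas.logicaltypes.OneOfType;

public class OneOfTypeExample {
  public static void main(String[] args) {
    // Build a union-like logical type over two possible cases.
    OneOfType oneOf =
        OneOfType.create(
            Schema.Field.of("intValue", Schema.FieldType.INT32),
            Schema.Field.of("stringValue", Schema.FieldType.STRING));
    // Select the "stringValue" case and supply its value.
    OneOfType.Value value = oneOf.createValue("stringValue", "hello");
    System.out.println(value.getValue().toString());
  }
}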
- createWatermarkPolicy() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory
- createWithAllowDuplicates() - Static method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
-
Returns a DataflowGroupByKey<K, V> PTransform whose output can have duplicated elements.
- createWithBytesReadConsumer(SeekableByteChannel, Consumer<Integer>) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- createWithBytesWrittenConsumer(SeekableByteChannel, Consumer<Integer>) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- createWithNoOpConsumer(ReadableByteChannel) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingReadableByteChannel
- createWithNoOpConsumer(SeekableByteChannel) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- createWithNoOpConsumer(WritableByteChannel) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingWritableByteChannel
- createWithPortSupplier(Supplier<Integer>) - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Create a ServerFactory.InetSocketAddressServerFactory that uses ports from a supplier.
- createWithUrlFactory(ServerFactory.UrlFactory) - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Create a ServerFactory.InetSocketAddressServerFactory that uses the given url factory.
- createWithUrlFactoryAndPortSupplier(ServerFactory.UrlFactory, Supplier<Integer>) - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Create a ServerFactory.InetSocketAddressServerFactory that uses the given url factory and ports from a supplier.
- createWrappingDoFnRunner(DoFnRunner<InputT, OutputT>, StepContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- createWrappingDoFnRunner(DoFnRunner<InputT, OutputT>, StepContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- createWrappingDoFnRunner(DoFnRunner<KeyedWorkItem<byte[], KV<InputT, RestrictionT>>, OutputT>, StepContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SplittableDoFnOperator
- createWrappingDoFnRunner(DoFnRunner<KeyedWorkItem<K, InputT>, KV<K, OutputT>>, StepContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.WindowDoFnOperator
- createWriteOperation() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSink
- createWriteOperation() - Method in class org.apache.beam.sdk.io.FileBasedSink
-
Return a subclass of FileBasedSink.WriteOperation that will manage the write to the sink.
- createWriter() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
Clients must implement this to return a subclass of FileBasedSink.Writer.
- createWriteStream(String, WriteStream.Type) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
-
Create a Write Stream for use with the Storage Write API.
- createWriteStream(String, WriteStream.Type) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- createWriteStream(String, WriteStream.Type) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- createZetaSqlFunction(String, SqlTypeName) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
-
Create a dummy SqlFunction of type OTHER_FUNCTION from the given function name and return type.
- Creating Tables - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- CredentialFactory - Interface in org.apache.beam.sdk.extensions.gcp.auth
-
Construct an OAuth credential to be used by the SDK and the SDK workers.
- credentialsProvider() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
-
Optional AwsCredentialsProvider.
- credentialsProvider(AwsCredentialsProvider) - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
-
Optional AwsCredentialsProvider.
- CrossLanguageConfiguration - Class in org.apache.beam.sdk.io.snowflake.crosslanguage
-
Abstract parameters class used to expose the transforms to an external SDK.
- CrossLanguageConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
- CrossLanguageConfiguration() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- crossProductJoin() - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.Impl
-
Expand the join into individual rows, similar to SQL joins.
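A hedged sketch of crossProductJoin(): join two schema-aware inputs on a shared field and expand to one output row per match (the tag names, the "userId" field, and the input collections lhs and rhs are illustrative assumptions):

    import org.apache.beam.sdk.schemas.transforms.CoGroup;
    import org.apache.beam.sdk.schemas.transforms.CoGroup.By;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;

    // lhs and rhs are assumed schema-aware PCollections sharing a "userId" field.
    PCollection<Row> joined =
        PCollectionTuple.of("lhs", lhs)
            .and("rhs", rhs)
            .apply(CoGroup.join(By.fieldNames("userId")).crossProductJoin());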
- CsvConfiguration() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.CsvConfiguration
- csvConfigurationBuilder() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- CsvIO - Class in org.apache.beam.sdk.io.csv
-
PTransforms for reading and writing CSV files.
- CsvIO() - Constructor for class org.apache.beam.sdk.io.csv.CsvIO
- CsvIO.Write<T> - Class in org.apache.beam.sdk.io.csv
-
PTransform for writing CSV files.
- CsvIOParse<T> - Class in org.apache.beam.sdk.io.csv
-
PTransform for parsing CSV record strings into Schema-mapped target types.
- CsvIOParse() - Constructor for class org.apache.beam.sdk.io.csv.CsvIOParse
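A hedged sketch of the CsvIO entry points above; the Order POJO, the paths, the header columns, and the input collections are illustrative assumptions:

    import org.apache.beam.sdk.io.csv.CsvIO;
    import org.apache.beam.sdk.io.csv.CsvIOParseResult;
    import org.apache.commons.csv.CSVFormat;

    // Parse CSV record strings into a hypothetical schema-mapped Order type.
    CsvIOParseResult<Order> parsed =
        csvLines.apply(CsvIO.parse(Order.class, CSVFormat.DEFAULT.withHeader("id", "amount")));

    // Write schema-mapped elements back out as CSV files.
    orders.apply(CsvIO.write("/tmp/orders", CSVFormat.DEFAULT.withHeader("id", "amount")));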
- CsvIOParseError - Class in org.apache.beam.sdk.io.csv
-
CsvIOParseError is a data class to store errors from CSV record processing.
- CsvIOParseResult<T> - Class in org.apache.beam.sdk.io.csv
- CsvIOParseResult<T> - Class in org.apache.beam.sdk.io.csv
- csvLines2BeamRows(CSVFormat, String, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
- CsvSink - Class in org.apache.beam.runners.spark.metrics.sink
-
A Sink for Spark's metric system that reports metrics (including Beam step metrics) to a CSV file.
- CsvSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.metrics.sink.CsvSink
-
Constructor for Spark 3.2.x and later.
- CsvSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.metrics.sink.CsvSink
-
Constructor for Spark 3.1.x and earlier.
- CsvToRow(Schema, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.CsvToRow
- CsvWriteConfiguration() - Constructor for class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration
- CsvWriteSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
-
A FileWriteSchemaTransformFormatProvider for the CSV format.
- CsvWriteSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.CsvWriteSchemaTransformFormatProvider
- CsvWriteTransformProvider - Class in org.apache.beam.sdk.io.csv.providers
-
An implementation of TypedSchemaTransformProvider for CsvIO.write(java.lang.String, org.apache.commons.csv.CSVFormat).
- CsvWriteTransformProvider() - Constructor for class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
- CsvWriteTransformProvider.CsvWriteConfiguration - Class in org.apache.beam.sdk.io.csv.providers
-
Configuration for writing CSV files.
- CsvWriteTransformProvider.CsvWriteConfiguration.Builder - Class in org.apache.beam.sdk.io.csv.providers
-
Builder for CsvWriteTransformProvider.CsvWriteConfiguration.
- CsvWriteTransformProvider.CsvWriteTransform - Class in org.apache.beam.sdk.io.csv.providers
- ctx - Variable in class org.apache.beam.runners.spark.translation.AbstractInOutIterator
- ctxt - Variable in class org.apache.beam.runners.spark.SparkRunner.Evaluator
- ctxt - Variable in class org.apache.beam.runners.spark.translation.GroupByKeyVisitor
- current() - Static method in class org.apache.beam.sdk.io.googleads.GoogleAdsIO
- CURRENT_METADATA_TABLE_VERSION - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- currentCatalog() - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.CatalogManager
-
Produces the currently active catalog.
- currentCatalog() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.EmptyCatalogManager
- currentCatalog() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogManager
- currentDatabase - Variable in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- currentDatabase() - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
-
Produces the currently active database.
- currentDatabase() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- currentEventTime() - Method in interface org.apache.beam.sdk.state.Timers
-
Returns the current event time.
- currentInputWatermarkTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- currentOutputWatermarkTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- currentProcessingTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- currentProcessingTime() - Method in interface org.apache.beam.sdk.state.Timers
-
Returns the current processing time.
- currentRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
-
Returns the streamProgress that was successfully claimed.
- currentRestriction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
-
Returns a restriction accurately describing the full range of work the current DoFn.ProcessElement call will do, including already completed work.
- currentSynchronizedProcessingTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
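For a concrete sense of currentRestriction(), a minimal hedged sketch using OffsetRangeTracker (the range bounds are illustrative); the tracker keeps reporting the full range, including work already claimed:

    import org.apache.beam.sdk.io.range.OffsetRange;
    import org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker;

    OffsetRangeTracker tracker = new OffsetRangeTracker(new OffsetRange(0, 100));
    tracker.tryClaim(0L); // claim the first offset
    OffsetRange full = tracker.currentRestriction(); // still [0, 100), including claimed work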
- currentSynchronizedProcessingTime() - Method in interface org.apache.beam.sdk.state.Timers
-
Returns the current synchronized processing time or null if unknown.
- currentWatermark - Variable in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
- currentWatermark() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimator
-
Returns the estimated output watermark.
- currentWatermark() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.Manual
- currentWatermark() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.MonotonicallyIncreasing
- currentWatermark() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.WallTime
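A minimal hedged sketch of the Manual estimator listed above, which reports whatever watermark the splittable DoFn last set:

    import org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators;
    import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
    import org.joda.time.Instant;

    WatermarkEstimators.Manual estimator =
        new WatermarkEstimators.Manual(BoundedWindow.TIMESTAMP_MIN_VALUE);
    estimator.setWatermark(Instant.now()); // advanced explicitly by the DoFn
    Instant watermark = estimator.currentWatermark();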
- custom() - Static method in class org.apache.beam.sdk.io.thrift.ThriftSchema
-
Builds a schema provider that maps any Thrift type to a Beam schema, allowing custom Thrift typedef entries (which cannot be resolved using the available metadata) to be manually registered with their corresponding Beam types.
- CUSTOM - Enum constant in enum class org.apache.beam.sdk.io.solace.SolaceIO.SubmissionMode
- CUSTOM_AUDIT_JOB_ENTRY_KEY - Static variable in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.GcsCustomAuditEntries
- CUSTOM_SOURCE_FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- CustomCoder<T> - Class in org.apache.beam.sdk.coders
-
An abstract base class that implements all methods of Coder except Coder.encode(T, java.io.OutputStream) and Coder.decode(java.io.InputStream).
- CustomCoder() - Constructor for class org.apache.beam.sdk.coders.CustomCoder
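A minimal hedged sketch of subclassing CustomCoder; the Point value type is a hypothetical two-int POJO used only for illustration:

    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import org.apache.beam.sdk.coders.CustomCoder;

    class PointCoder extends CustomCoder<Point> {
      @Override
      public void encode(Point value, OutputStream out) throws IOException {
        DataOutputStream data = new DataOutputStream(out);
        data.writeInt(value.x);
        data.writeInt(value.y);
        data.flush();
      }

      @Override
      public Point decode(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(in);
        return new Point(data.readInt(), data.readInt());
      }
    }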
- Customer - Class in org.apache.beam.sdk.extensions.sql.example.model
-
Describes a customer.
- Customer() - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Customer
- Customer(int, String, String) - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Customer
- CustomHttpErrors - Class in org.apache.beam.sdk.extensions.gcp.util
-
An optional component to use with the RetryHttpRequestInitializer in order to provide custom errors for failing HTTP calls.
- CustomHttpErrors.Builder - Class in org.apache.beam.sdk.extensions.gcp.util
-
A Builder which allows building an immutable CustomHttpErrors object.
- CustomHttpErrors.MatcherAndError - Class in org.apache.beam.sdk.extensions.gcp.util
-
A simple tuple class pairing an HttpResponseMatcher with the HttpResponseCustomError to print for matching responses.
- CustomSources - Class in org.apache.beam.runners.dataflow.internal
-
A helper class for supporting sources defined as Source.
- CustomSources() - Constructor for class org.apache.beam.runners.dataflow.internal.CustomSources
- CustomTableResolver - Interface in org.apache.beam.sdk.extensions.sql.meta
-
Interface that table providers can implement if they require custom table name resolution.
- CustomTimestampPolicyWithLimitedDelay<K, V> - Class in org.apache.beam.sdk.io.kafka
-
A policy for custom record timestamps where timestamps within a partition are expected to be roughly monotonically increasing with a cap on out of order event delays (say 1 minute).
- CustomTimestampPolicyWithLimitedDelay(SerializableFunction<KafkaRecord<K, V>, Instant>, Duration, Optional<Instant>) - Constructor for class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
-
A policy for custom record timestamps where timestamps are expected to be roughly monotonically increasing with out-of-order event delays less than maxDelay.
- Custom timestamps - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
- Section
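A hedged sketch of wiring CustomTimestampPolicyWithLimitedDelay into a KafkaIO read; the broker, topic, and one-minute delay cap are illustrative, and deserializer configuration is omitted for brevity:

    import org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    KafkaIO.Read<String, String> read =
        KafkaIO.<String, String>read()
            .withBootstrapServers("broker:9092")
            .withTopic("events")
            .withTimestampPolicyFactory(
                (topicPartition, previousWatermark) ->
                    new CustomTimestampPolicyWithLimitedDelay<>(
                        record -> new Instant(record.getTimestamp()),
                        Duration.standardMinutes(1),
                        previousWatermark));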
- CustomX509TrustManager - Class in org.apache.beam.sdk.io.splunk
-
A custom X509TrustManager that trusts a user-provided CA and the default CAs.
- CustomX509TrustManager(X509Certificate) - Constructor for class org.apache.beam.sdk.io.splunk.CustomX509TrustManager
D
- DAGBuilder - Class in org.apache.beam.runners.jet
-
Utility class for wiring up Jet DAGs based on Beam pipelines.
- DAGBuilder.WiringListener - Interface in org.apache.beam.runners.jet
-
Listener that can be registered with a DAGBuilder in order to be notified when edges are being registered.
- DaoFactory - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
- DaoFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Factory class to create data access objects to perform change stream queries and access the metadata tables.
- DaoFactory(BigtableConfig, BigtableConfig, String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- DaoFactory(SpannerConfig, String, SpannerConfig, PartitionMetadataTableNames, Options.RpcPriority, String, Dialect, Dialect) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Constructs a DaoFactory with the configuration to be used for the underlying instances.
- data() - Method in class org.apache.beam.sdk.io.solace.data.Semp.Queue
- data(String, String) - Static method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
- data(StreamObserver<BeamFnApi.Elements>) - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
- Data() - Constructor for class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
- DATA_BUFFER_SIZE_LIMIT - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
- DATA_BUFFER_TIME_LIMIT_MS - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
- DATA_RECORD_COMMITTED_TO_EMITTED_0MS_TO_1000MS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for record latencies [0, 1000) ms during the execution of the Connector.
- DATA_RECORD_COMMITTED_TO_EMITTED_1000MS_TO_3000MS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for record latencies [1000, 3000) ms during the execution of the Connector.
- DATA_RECORD_COMMITTED_TO_EMITTED_3000MS_TO_INF_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for record latencies equal to or above 3000 ms during the execution of the Connector.
- DATA_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of data records identified during the execution of the Connector.
- database() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
- databaseId - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
- databaseId - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
- databaseId - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
- Database Schema Preparation - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- DataCatalogPipelineOptions - Interface in org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
-
Pipeline options for Data Catalog table provider.
- DataCatalogPipelineOptionsRegistrar - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
- DataCatalogPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptionsRegistrar
- dataCatalogSegments() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
- dataCatalogSegments(TableReference, BigQueryOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
- DataCatalogTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
-
Uses DataCatalog to get the source type and schema for a table.
- DataChangeRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
A data change record encodes modifications to Cloud Spanner rows.
- DataChangeRecord(String, Timestamp, String, boolean, String, String, List<ColumnType>, List<Mod>, ModType, ValueCaptureType, long, long, String, boolean, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
Constructs a data change record for a given partition, at a given timestamp, for a given transaction.
- dataChangeRecordAction(ThroughputEstimator<DataChangeRecord>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class capable of processing DataChangeRecords.
- DataChangeRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is part of the process for the ReadChangeStreamPartitionDoFn SDF.
- DataChangeRecordAction(ThroughputEstimator<DataChangeRecord>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DataChangeRecordAction
- DataEndpoint<T> - Class in org.apache.beam.sdk.fn.data
- DataEndpoint() - Constructor for class org.apache.beam.sdk.fn.data.DataEndpoint
- DataflowClient - Class in org.apache.beam.runners.dataflow
-
Wrapper around the generated Dataflow client to provide common functionality.
- DataflowClientFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory
- DataflowGroupByKey<K, V> - Class in org.apache.beam.runners.dataflow.internal
-
Specialized implementation of GroupByKey for translating the Redistribute transform into Dataflow service protos.
- DataflowGroupByKey.Registrar - Class in org.apache.beam.runners.dataflow.internal
-
Registers DataflowGroupByKey.DataflowGroupByKeyTranslator.
- DataflowJobAlreadyExistsException - Exception Class in org.apache.beam.runners.dataflow
-
An exception that is thrown if the unique job name constraint of the Dataflow service is broken because an existing job with the same job name is currently active.
- DataflowJobAlreadyExistsException(DataflowPipelineJob, String) - Constructor for exception class org.apache.beam.runners.dataflow.DataflowJobAlreadyExistsException
-
Create a new DataflowJobAlreadyExistsException with the specified DataflowPipelineJob and message.
- DataflowJobAlreadyUpdatedException - Exception Class in org.apache.beam.runners.dataflow
-
An exception that is thrown if the existing job has already been updated within the Dataflow service and is no longer able to be updated.
- DataflowJobAlreadyUpdatedException(DataflowPipelineJob, String) - Constructor for exception class org.apache.beam.runners.dataflow.DataflowJobAlreadyUpdatedException
-
Create a new DataflowJobAlreadyUpdatedException with the specified DataflowPipelineJob and message.
- DataflowJobException - Exception Class in org.apache.beam.runners.dataflow
-
A RuntimeException that contains information about a DataflowPipelineJob.
- DataflowPipelineDebugOptions - Interface in org.apache.beam.runners.dataflow.options
-
Internal.
- DataflowPipelineDebugOptions.DataflowClientFactory - Class in org.apache.beam.runners.dataflow.options
-
Returns the default Dataflow client built from the passed-in PipelineOptions.
- DataflowPipelineDebugOptions.StagerFactory - Class in org.apache.beam.runners.dataflow.options
-
Creates a Stager object using the class specified in DataflowPipelineDebugOptions.getStagerClass().
- DataflowPipelineDebugOptions.UnboundedReaderMaxReadTimeFactory - Class in org.apache.beam.runners.dataflow.options
-
Sets an Integer value based on the old, deprecated field (DataflowPipelineDebugOptions.getUnboundedReaderMaxReadTimeSec()).
- DataflowPipelineJob - Class in org.apache.beam.runners.dataflow
-
A DataflowPipelineJob represents a job submitted to Dataflow using DataflowRunner.
- DataflowPipelineJob(DataflowClient, String, DataflowPipelineOptions, Map<AppliedPTransform<?, ?, ?>, String>) - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Constructs the job.
- DataflowPipelineJob(DataflowClient, String, DataflowPipelineOptions, Map<AppliedPTransform<?, ?, ?>, String>, RunnerApi.Pipeline) - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Constructs the job.
- DataflowPipelineOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options that can be used to configure the DataflowRunner.
- DataflowPipelineOptions.FlexResourceSchedulingGoal - Enum Class in org.apache.beam.runners.dataflow.options
-
Set of available Flexible Resource Scheduling goals.
- DataflowPipelineOptions.StagingLocationFactory - Class in org.apache.beam.runners.dataflow.options
-
Returns a default staging location under GcpOptions.getGcpTempLocation().
- DataflowPipelineRegistrar - Class in org.apache.beam.runners.dataflow
- DataflowPipelineRegistrar.Options - Class in org.apache.beam.runners.dataflow
-
Register the DataflowPipelineOptions.
- DataflowPipelineRegistrar.Runner - Class in org.apache.beam.runners.dataflow
-
Register the DataflowRunner.
- DataflowPipelineTranslator - Class in org.apache.beam.runners.dataflow
-
DataflowPipelineTranslator knows how to translate Pipeline objects into Cloud Dataflow Service API Jobs.
- DataflowPipelineTranslator.JobSpecification - Class in org.apache.beam.runners.dataflow
-
The result of a job translation.
- DataflowPipelineWorkerPoolOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options that are used to configure the Dataflow pipeline worker pool.
- DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType - Enum Class in org.apache.beam.runners.dataflow.options
-
Type of autoscaling algorithm to use.
- DataflowProfilingAgentConfiguration() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowProfilingOptions.DataflowProfilingAgentConfiguration
- DataflowProfilingOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options for controlling profiling of pipeline execution.
- DataflowProfilingOptions.DataflowProfilingAgentConfiguration - Class in org.apache.beam.runners.dataflow.options
-
Configuration for the profiling agent.
- DataflowRunner - Class in org.apache.beam.runners.dataflow
-
A PipelineRunner that executes the operations in the pipeline by first translating them to the Dataflow representation using the DataflowPipelineTranslator and then submitting them to a Dataflow service for execution.
- DataflowRunner(DataflowPipelineOptions) - Constructor for class org.apache.beam.runners.dataflow.DataflowRunner
- DataflowRunner.DataflowTransformTranslator - Class in org.apache.beam.runners.dataflow
- DataflowRunner.StreamingPCollectionViewWriterFn<T> - Class in org.apache.beam.runners.dataflow
-
A marker DoFn for writing the contents of a PCollection to a streaming PCollectionView backend implementation.
- DataflowRunnerHooks - Class in org.apache.beam.runners.dataflow
-
An instance of this class can be passed to the DataflowRunner to add user-defined hooks to be invoked at various times during pipeline execution.
- DataflowRunnerHooks() - Constructor for class org.apache.beam.runners.dataflow.DataflowRunnerHooks
- DataflowRunnerInfo - Class in org.apache.beam.runners.dataflow
-
Populates versioning and other information for DataflowRunner.
- DataflowServiceException - Exception Class in org.apache.beam.runners.dataflow
-
Signals there was an error retrieving information about a job from the Cloud Dataflow Service.
- DataflowStreamingPipelineOptions - Interface in org.apache.beam.runners.dataflow.options
-
[Internal] Options for configuring StreamingDataflowWorker.
- DataflowStreamingPipelineOptions.EnableWindmillServiceDirectPathFactory - Class in org.apache.beam.runners.dataflow.options
-
EnableStreamingEngine defaults to false unless one of the experiments is set.
- DataflowStreamingPipelineOptions.GlobalConfigRefreshPeriodFactory - Class in org.apache.beam.runners.dataflow.options
-
Read global get config request period from system property 'windmill.global_config_refresh_period'.
- DataflowStreamingPipelineOptions.HarnessUpdateReportingPeriodFactory - Class in org.apache.beam.runners.dataflow.options
-
Read counter reporting period from system property 'windmill.harness_update_reporting_period'.
- DataflowStreamingPipelineOptions.LocalWindmillHostportFactory - Class in org.apache.beam.runners.dataflow.options
-
Factory for creating local Windmill address.
- DataflowStreamingPipelineOptions.MaxStackTraceDepthToReportFactory - Class in org.apache.beam.runners.dataflow.options
-
Read 'MaxStackTraceToReport' from system property 'windmill.max_stack_trace_to_report' or Integer.MAX_VALUE if unspecified.
- DataflowStreamingPipelineOptions.PeriodicStatusPageDirectoryFactory - Class in org.apache.beam.runners.dataflow.options
-
Read 'PeriodicStatusPageOutputDirectory' from system property 'windmill.periodic_status_page_directory' or null if unspecified.
- DataflowStreamingPipelineOptions.WindmillServiceStreamingRpcBatchLimitFactory - Class in org.apache.beam.runners.dataflow.options
-
Factory for setting value of WindmillServiceStreamingRpcBatchLimit based on environment.
- DataflowTemplateJob - Class in org.apache.beam.runners.dataflow.util
-
A DataflowPipelineJob that is returned when --templateRunner is set.
- DataflowTemplateJob() - Constructor for class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
- DataflowTransformTranslator() - Constructor for class org.apache.beam.runners.dataflow.DataflowRunner.DataflowTransformTranslator
- DataflowTransport - Class in org.apache.beam.runners.dataflow.util
-
Helpers for cloud communication.
- DataflowTransport() - Constructor for class org.apache.beam.runners.dataflow.util.DataflowTransport
- DataflowWorkerHarnessOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options that are used exclusively within the Dataflow worker harness.
- DataflowWorkerLoggingOptions - Interface in org.apache.beam.runners.dataflow.options
-
Deprecated. This interface will no longer be the source of truth for worker logging configuration once jobs are executed using a dedicated SDK harness instead of user code being co-located alongside Dataflow worker code. Consider setting the corresponding options within SdkHarnessOptions to ensure forward compatibility.
- DataflowWorkerLoggingOptions.Level - Enum Class in org.apache.beam.runners.dataflow.options
-
Deprecated. The set of log levels that can be used on the Dataflow worker.
- DataflowWorkerLoggingOptions.WorkerLogLevelOverrides - Class in org.apache.beam.runners.dataflow.options
-
Deprecated. Defines a log level override for a specific class, package, or name.
- DataframeTransform - Class in org.apache.beam.sdk.extensions.python.transforms
-
Wrapper for invoking the external Python DataframeTransform.
- DataGeneratorPTransform - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datagen
-
The main PTransform that encapsulates the data generation logic.
- DataGeneratorPTransform(Schema, ObjectNode) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorPTransform
- DataGeneratorRowFn - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datagen
-
A stateful DoFn that converts a sequence of Longs into structured Rows.
- DataGeneratorRowFn(Schema, ObjectNode, String) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorRowFn
- DataGeneratorTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datagen
-
Represents a 'datagen' table within a Beam SQL pipeline.
- DataGeneratorTable(Schema, ObjectNode) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTable
- DataGeneratorTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datagen
-
The service entry point for the 'datagen' table type.
- DataGeneratorTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTableProvider
- DataInputViewWrapper - Class in org.apache.beam.runners.flink.translation.wrappers
-
Wrapper for DataInputView.
- DataInputViewWrapper(DataInputView) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.DataInputViewWrapper
- DataOutputViewWrapper - Class in org.apache.beam.runners.flink.translation.wrappers
-
Wrapper for DataOutputView.
- DataOutputViewWrapper(DataOutputView) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.DataOutputViewWrapper
- dataSchema - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- dataset - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- Dataset - Interface in org.apache.beam.runners.spark.translation
-
Holder for Spark RDD/DStream.
- datasetExists(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- dataSets - Variable in class org.apache.beam.runners.twister2.Twister2TranslationContext
- DatasetServiceImpl(BigQueryOptions) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
- dataSourceConfiguration() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransform
- dataSourceConfiguration() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransform
- DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.influxdb.InfluxDbIO.DataSourceConfiguration
- DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
- DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
- DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- DatastoreIO - Class in org.apache.beam.sdk.io.gcp.datastore
-
DatastoreIO provides an API for reading from and writing to Google Cloud Datastore over different versions of the Cloud Datastore Client libraries.
- DatastoreV1 - Class in org.apache.beam.sdk.io.gcp.datastore
-
DatastoreV1 provides an API to Read, Write and Delete PCollections of Google Cloud Datastore version v1 Entity objects.
- DatastoreV1.DeleteEntity - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that deletes Entities from Cloud Datastore.
- DatastoreV1.DeleteEntityWithSummary - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that deletes Entities from Cloud Datastore and returns DatastoreV1.WriteSuccessSummary for each successful write.
- DatastoreV1.DeleteKey - Class in org.apache.beam.sdk.io.gcp.datastore
- DatastoreV1.DeleteKeyWithSummary - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that deletes Entities associated with the given Keys from Cloud Datastore and returns DatastoreV1.WriteSuccessSummary for each successful delete.
- DatastoreV1.Read - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that reads the result rows of a Cloud Datastore query as Entity objects.
- DatastoreV1.Write - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that writes Entity objects to Cloud Datastore.
- DatastoreV1.WriteSuccessSummary - Class in org.apache.beam.sdk.io.gcp.datastore
-
Summary object produced when a number of writes are successfully written to Datastore in a single Mutation.
- DatastoreV1.WriteWithSummary - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that writes Entity objects to Cloud Datastore and returns DatastoreV1.WriteSuccessSummary for each successful write.
- DataStoreV1SchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.datastore
-
An implementation of SchemaIOProvider for reading and writing payloads with DatastoreIO.
- DataStoreV1SchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
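A hedged sketch of the underlying DatastoreIO API that this provider wraps; the project id and the entities collection are illustrative assumptions:

    import com.google.datastore.v1.Entity;
    import org.apache.beam.sdk.io.gcp.datastore.DatastoreIO;
    import org.apache.beam.sdk.values.PCollection;

    // entities is an assumed PCollection<Entity>.
    entities.apply(DatastoreIO.v1().write().withProjectId("my-project"));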
- DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO - Class in org.apache.beam.sdk.io.gcp.datastore
-
An abstraction to create schema-aware IOs.
- DataStoreV1TableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datastore
-
TableProvider for DatastoreIO for consumption by Beam SQL.
- DataStoreV1TableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
- DataStreamDecoder(Coder<T>, PrefetchableIterator<ByteString>) - Constructor for class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
- DataStreams - Class in org.apache.beam.sdk.fn.stream
-
DataStreams.DataStreamDecoder treats multiple ByteStrings as a single input stream, decoding values with the supplied iterator.
- DataStreams() - Constructor for class org.apache.beam.sdk.fn.stream.DataStreams
- DataStreams.DataStreamDecoder<T> - Class in org.apache.beam.sdk.fn.stream
-
An adapter which converts an InputStream to a PrefetchableIterator of T values using the specified Coder.
- DataStreams.ElementDelimitedOutputStream - Class in org.apache.beam.sdk.fn.stream
-
An adapter which wraps a DataStreams.OutputChunkConsumer as an OutputStream.
- DataStreams.OutputChunkConsumer<T> - Interface in org.apache.beam.sdk.fn.stream
-
A callback which is invoked whenever the DataStreams.outbound(org.apache.beam.sdk.fn.stream.DataStreams.OutputChunkConsumer<org.apache.beam.vendor.grpc.v1p69p0.com.google.protobuf.ByteString>) OutputStream becomes full.
- date(Integer, Integer, Integer) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.DateFunctions
- date(DateTime) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.DateFunctions
- date(DateTime, String) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.DateFunctions
- Date - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A date without a time-zone.
- Date() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.Date
- DATE - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- DATE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- DATE - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- DATE - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SqlTypes
-
Beam LogicalType corresponding to ZetaSQL/CalciteSQL DATE type.
- DATE_FIELD_NAME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- DATE_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
- DATE_OP - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- DATE_TYPES - Static variable in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- DateConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
- DateConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.DateConversion
- DateFunctions - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
-
DateFunctions.
- DateFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.DateFunctions
- DateIncrementAllFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.DateIncrementAllFn
- DateTime - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A datetime without a time-zone.
- DateTime() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- DATETIME - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- DATETIME - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- DATETIME - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- DATETIME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SqlTypes
-
Beam LogicalType corresponding to ZetaSQL DATETIME type.
- DATETIME - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of datetime fields.
- DATETIME_SCHEMA - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- DateTimeBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.DateTimeBundle
- DateTimeUtils - Class in org.apache.beam.sdk.extensions.sql.zetasql
-
DateTimeUtils.
- DateTimeUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- days(int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
-
Returns a WindowFn that windows elements into periods measured by days.
- DB2 - Enum constant in enum class org.apache.beam.io.debezium.Connectors
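For example, a hedged sketch windowing an assumed PCollection<String> named events into single calendar days:

    import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<String> daily =
        events.apply(Window.into(CalendarWindows.days(1)));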
- DDL_EXECUTOR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.parser.BeamSqlParser
-
DDL executor.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
-
The tag for the deadletter output of FHIR resources.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
-
The tag for the deadletter output of FHIR Resources.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
-
The tag for the deadletter output of FHIR Resources from a GetPatientEverything request.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read
-
The tag for the deadletter output of HL7v2 read responses.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
-
The tag for the deadletter output of HL7v2 Messages.
- DeadLetteredTransform<InputT, OutputT> - Class in org.apache.beam.sdk.schemas.io
- DeadLetteredTransform(SimpleFunction<InputT, OutputT>, String) - Constructor for class org.apache.beam.sdk.schemas.io.DeadLetteredTransform
- deadLetterQueue - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- Section
- DebeziumIO - Class in org.apache.beam.io.debezium
-
Utility class which exposes an implementation of DebeziumIO.read() and a Debezium configuration.
- DebeziumIO.ConnectorConfiguration - Class in org.apache.beam.io.debezium
-
A POJO describing a Debezium configuration.
- DebeziumIO.Read<T> - Class in org.apache.beam.io.debezium
-
Implementation of DebeziumIO.read().
- DebeziumReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- DebeziumReadSchemaTransformProvider - Class in org.apache.beam.io.debezium
-
A schema-aware transform provider for DebeziumIO.
- DebeziumReadSchemaTransformProvider() - Constructor for class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
- DebeziumReadSchemaTransformProvider(Boolean, Integer, Long) - Constructor for class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
- DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration - Class in org.apache.beam.io.debezium
- DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.io.debezium
- debeziumRecordInstant(SourceRecord) - Static method in class org.apache.beam.io.debezium.KafkaConnectUtils
- DebeziumSDFDatabaseHistory() - Constructor for class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
- DebeziumTransformRegistrar - Class in org.apache.beam.io.debezium
-
Exposes DebeziumIO.Read as an external transform for cross-language usage.
- DebeziumTransformRegistrar() - Constructor for class org.apache.beam.io.debezium.DebeziumTransformRegistrar
- DebeziumTransformRegistrar.ReadBuilder - Class in org.apache.beam.io.debezium
- DebeziumTransformRegistrar.ReadBuilder.Configuration - Class in org.apache.beam.io.debezium
- DEBUG - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
-
Deprecated. Level for logging diagnostic messages.
- DEBUG - Enum constant in enum class org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
-
LogLevel for logging diagnostic messages.
- dec() - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
- dec() - Method in interface org.apache.beam.sdk.metrics.Counter
- dec() - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
- dec() - Method in class org.apache.beam.sdk.metrics.NoOpCounter
- dec(long) - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
- dec(long) - Method in interface org.apache.beam.sdk.metrics.Counter
- dec(long) - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
- dec(long) - Method in class org.apache.beam.sdk.metrics.NoOpCounter
- decActivePartitionReadCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Decrements the ChangeStreamMetrics.ACTIVE_PARTITION_READ_COUNT by 1 if the metric is enabled.
- DECIMAL - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- DECIMAL - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- DECIMAL - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of decimal fields.
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator.SequenceRangeAccumulatorCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.ReadableFileCoder
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.FooterCoder
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShardCoder
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefixCoder
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.MetadataKeyCoder
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- decode(InputStream) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- decode(InputStream) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BitSetCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BooleanCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ByteCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.Coder
-
Decodes a value of type T from the given input stream in the given context.
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DelegateCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DoubleCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DurationCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.FloatCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.InstantCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.KvCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.MapCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.NullableCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.OptionalCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.SerializableCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.SnappyCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VarIntCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VarLongCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VoidCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ZstdCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.AttributeValueCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.CountingSource.CounterMarkCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocumentCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoderV2
- decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation.RowMutationCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponseCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessageCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.thrift.ThriftCoder
-
Decodes a value of type T from the given input stream using the provided ThriftCoder.protocolFactory.
- decode(InputStream) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.schemas.SchemaCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.values.WindowedValues.ValueOnlyWindowedValueCoder
-
Deprecated.
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BitSetCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.Coder
-
Deprecated. Only implement and call Coder.decode(InputStream).
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.DelegateCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.KvCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.MapCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.NullableCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.WindowedValues.ValueOnlyWindowedValueCoder
-
Deprecated.
- decodeFromChunkBoundaryToChunkBoundary() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
-
Skips any remaining bytes in the current ByteString, moving to the next ByteString in the underlying ByteString iterator and decoding elements until the next boundary.
- decodeKey(ByteBuffer, Coder<K>) - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.FlinkKeyUtils
-
Decodes a key from a ByteBuffer containing a byte array.
- decodePacked32TimeSeconds(int) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeSeconds as a LocalTime with seconds precision.
- decodePacked32TimeSecondsAsJavaTime(int) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeSeconds as a LocalTime with seconds precision.
- decodePacked64DatetimeMicros(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldDatetimeMicros as a LocalDateTime with microseconds precision.
- decodePacked64DatetimeMicrosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldDatetimeMicros as a LocalDateTime with microseconds precision.
- decodePacked64DatetimeSeconds(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldDatetimeSeconds as a LocalDateTime with seconds precision.
- decodePacked64DatetimeSecondsAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldDatetimeSeconds as a LocalDateTime with seconds precision.
- decodePacked64TimeMicros(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeMicros as a LocalTime with microseconds precision.
- decodePacked64TimeMicrosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeMicros as a LocalTime with microseconds precision.
- decodePacked64TimeNanos(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeNanos as a LocalTime with nanoseconds precision.
- decodePacked64TimeNanosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeNanos as a LocalTime with nanoseconds precision.
- decodePane(byte) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
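A hedged round-trip sketch of the packed time encodings above, assuming the matching encodePacked64TimeMicros overload for java.time:

    import java.time.LocalTime;
    import org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder;

    long packed = CivilTimeEncoder.encodePacked64TimeMicros(LocalTime.of(12, 30, 15));
    LocalTime decoded = CivilTimeEncoder.decodePacked64TimeMicrosAsJavaTime(packed);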
- decodeQueryResult(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- decodeTimerDataTimerId(String) - Static method in class org.apache.beam.runners.fnexecution.control.TimerReceiverFactory
-
Decodes a string into the transform and timer family ids.
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.CollectionCoder
-
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.DequeCoder
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.IterableCoder
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.ListCoder
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.SetCoder
-
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
- decodeToIterable(List<T>, long, InputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements with the InputStream at the position where this coder detected the end of the stream.
- decodeWindowedValue(byte[], Coder) - Static method in class org.apache.beam.runners.jet.Utils
- DecodingFnDataReceiver<T> - Class in org.apache.beam.sdk.fn.data
-
A receiver of encoded data, decoding it and passing it onto a downstream consumer.
- DecodingFnDataReceiver(Coder<T>, FnDataReceiver<T>) - Constructor for class org.apache.beam.sdk.fn.data.DecodingFnDataReceiver
- decPartitionStreamCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Decrements the ChangeStreamMetrics.PARTITION_STREAM_COUNT by 1.
- DECRBY - Enum constant in enum class org.apache.beam.sdk.io.redis.RedisIO.Write.Method
-
Use DECRBY command.
- decrementingLongs() - Static method in class org.apache.beam.sdk.fn.IdGenerators
-
Returns an IdGenerator that will provide successive decrementing longs.
- DedupingOperator<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io
-
Remove values with duplicate ids.
- DedupingOperator(PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.DedupingOperator
- deduplicate() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- deduplicate(UuidDeduplicationOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
-
Removes duplicates from the output of a read PTransform.
- Deduplicate - Class in org.apache.beam.sdk.transforms
-
A set of PTransforms which deduplicate input records over a time domain and threshold.
- Deduplicate.KeyedValues<K, V> - Class in org.apache.beam.sdk.transforms
-
Deduplicates keyed values using the key over a specified time domain and threshold.
- Deduplicate.Values<T> - Class in org.apache.beam.sdk.transforms
-
Deduplicates values over a specified time domain and threshold.
- Deduplicate.WithRepresentativeValues<T, IdT> - Class in org.apache.beam.sdk.transforms
-
A PTransform that uses a SerializableFunction to obtain a representative value for each input element used for deduplication.
- Deduplication - Search tag in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
- Section
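A minimal sketch of the Deduplicate transforms listed above, assuming an input PCollection<String> named events (the variable names are illustrative):

    import org.apache.beam.sdk.transforms.Deduplicate;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Drop values that repeat within a 10-minute window of processing time
    // (the documented defaults for duration and time domain).
    PCollection<String> deduped =
        events.apply(Deduplicate.<String>values().withDuration(Duration.standardMinutes(10)));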
- deepEquals(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
- deepEquals(Object, Object, Schema.FieldType) - Static method in class org.apache.beam.sdk.values.Row.Equals
- deepHashCode(Object, Schema.FieldType) - Static method in class org.apache.beam.sdk.values.Row.Equals
- DEF - Static variable in interface org.apache.beam.sdk.extensions.sql.impl.planner.NodeStatsMetadata
- Default - Annotation Interface in org.apache.beam.sdk.options
-
Default represents a set of annotations that can be used to annotate getter properties on PipelineOptions with information representing the default value to be returned if no value is specified.
- Default() - Constructor for class org.apache.beam.sdk.fn.stream.PrefetchableIterables.Default
- DEFAULT - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.DefaultType
- DEFAULT - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
-
The default behavior if no method is explicitly set.
- DEFAULT - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
The default behavior if no method is explicitly set.
- DEFAULT - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
- DEFAULT - Static variable in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
- DEFAULT - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- DEFAULT_ADVANCE_TIMEOUT_IN_MILLIS - Static variable in class org.apache.beam.sdk.io.solace.broker.SolaceMessageReceiver
- DEFAULT_ATTRIBUTE - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
- DEFAULT_BUFFER_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
- DEFAULT_BUFFER_LIMIT_TIME_MS - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
- DEFAULT_BUFFER_SIZE - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- DEFAULT_BYTE_ARRAY_CODER - Static variable in class org.apache.beam.sdk.io.TFRecordIO
-
The default coder, which returns each record of the input file as a byte array.
- DEFAULT_CALC - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
- DEFAULT_CHANGE_STREAM_NAME - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default change stream name for a change stream query is the empty String.
- DEFAULT_CONTEXT - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- DEFAULT_DEDUPLICATE_DURATION - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- DEFAULT_DURATION - Static variable in class org.apache.beam.sdk.transforms.Deduplicate
-
The default duration is 10 minutes.
- DEFAULT_INCLUSIVE_END_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default end timestamp for a change stream query is ChangeStreamsConstants.MAX_INCLUSIVE_END_AT.
- DEFAULT_INCLUSIVE_START_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default start timestamp for a change stream query is Timestamp.MIN_VALUE.
- DEFAULT_INITIAL_BACKOFF - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
- DEFAULT_MASTER_URL - Static variable in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- DEFAULT_MAX_CUMULATIVE_BACKOFF - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
- DEFAULT_MAX_ELEMENTS_TO_OUTPUT - Static variable in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
- DEFAULT_MAX_INSERT_BLOCK_SIZE - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
- DEFAULT_MAX_INVOCATION_HISTORY - Static variable in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
The default maximum number of completed invocations to keep.
- DEFAULT_MAX_NUM_ELEMENTS - Static variable in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
The cost (in time and space) to compute quantiles to a given accuracy is a function of the total number of elements in the data set.
- DEFAULT_MAX_RETRIES - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
- DEFAULT_METADATA_TABLE_NAME - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- DEFAULT_OUTBOUND_BUFFER_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.fn.stream.DataStreams
- DEFAULT_PRECISION - Static variable in class org.apache.beam.sdk.extensions.zetasketch.HllCount
-
The default precision value used in HllCount.Init.Builder.withPrecision(int) is 15.
- DEFAULT_RPC_PRIORITY - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default priority for a change stream query is Options.RpcPriority.HIGH.
- DEFAULT_SCHEMA_FIELD_NAME - Static variable in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- DEFAULT_SCHEMA_RECORD_NAME - Static variable in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- DEFAULT_SCHEME - Static variable in class org.apache.beam.sdk.io.FileSystems
- DEFAULT_SESSION_DURATION_SECS - Static variable in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider
- DEFAULT_TIME_DOMAIN - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- DEFAULT_TIME_DOMAIN - Static variable in class org.apache.beam.sdk.transforms.Deduplicate
-
The default is the processing time domain.
- DEFAULT_TIMEOUT - Static variable in class org.apache.beam.io.requestresponse.RequestResponseIO
-
The default Duration to wait until completion of user code.
- DEFAULT_UNWINDOWED_SHARD_TEMPLATE - Static variable in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
The default sharding name template.
- DEFAULT_UPLOAD_BUFFER_SIZE_BYTES - Static variable in class org.apache.beam.sdk.io.aws2.options.S3Options.S3UploadBufferSizeBytesFactory
- DEFAULT_USES_RESHUFFLE - Static variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- DEFAULT_UUID_EXTRACTOR - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- DEFAULT_VPN_NAME - Static variable in class org.apache.beam.sdk.io.solace.broker.SessionService
- DEFAULT_WATERMARK_REFRESH_RATE - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default period at which the watermark of the DetectNewPartitionsDoFn stage is recomputed.
- DEFAULT_WINDOWED_SHARD_TEMPLATE - Static variable in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
The default windowed sharding name template used when writing windowed files.
- DEFAULT_WRITER_CLIENTS_PER_WORKER - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
- DEFAULT_WRITER_DELIVERY_MODE - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
- DEFAULT_WRITER_NUM_SHARDS - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
- DEFAULT_WRITER_PUBLISH_LATENCY_METRICS - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
- DEFAULT_WRITER_SUBMISSION_MODE - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
- DEFAULT_WRITER_TYPE - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
- Default.Boolean - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified boolean primitive value.
- Default.Byte - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified byte primitive value.
- Default.Character - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified char primitive value.
- Default.Class - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified Class value.
- Default.Double - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified double primitive value.
- Default.Enum - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified enum.
- Default.Float - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified float primitive value.
- Default.InstanceFactory - Annotation Interface in org.apache.beam.sdk.options
-
Value must be of type DefaultValueFactory and have a default constructor.
- Default.Integer - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified int primitive value.
- Default.Long - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified long primitive value.
- Default.Short - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified short primitive value.
- Default.String - Annotation Interface in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified String value.
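A minimal sketch of the Default annotations above in use on a hypothetical options interface (the option name is illustrative):

    import org.apache.beam.sdk.options.Default;
    import org.apache.beam.sdk.options.Description;
    import org.apache.beam.sdk.options.PipelineOptions;

    public interface MyPipelineOptions extends PipelineOptions {
      @Description("Number of retries for failed requests")
      @Default.Integer(3) // returned when --retryCount is not specified
      int getRetryCount();
      void setRetryCount(int value);
    }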
- DefaultAutoscaler - Class in org.apache.beam.sdk.io.jms
-
Default implementation of AutoScaler.
- DefaultAutoscaler() - Constructor for class org.apache.beam.sdk.io.jms.DefaultAutoscaler
- DefaultBlobstoreClientBuilderFactory - Class in org.apache.beam.sdk.io.azure.blobstore
-
Constructs a BlobServiceClientBuilder with the given values of Azure client properties.
- DefaultBlobstoreClientBuilderFactory() - Constructor for class org.apache.beam.sdk.io.azure.blobstore.DefaultBlobstoreClientBuilderFactory
- DefaultCoder - Annotation Interface in org.apache.beam.sdk.coders
-
The DefaultCoder annotation specifies a Coder class to handle encoding and decoding instances of the annotated class.
- DefaultCoder.DefaultCoderProviderRegistrar - Class in org.apache.beam.sdk.coders
-
A CoderProviderRegistrar that registers a CoderProvider which can use the @DefaultCoder annotation to provide coder providers that create Coders.
- DefaultCoder.DefaultCoderProviderRegistrar.DefaultCoderProvider - Class in org.apache.beam.sdk.coders
-
A CoderProvider that uses the @DefaultCoder annotation to provide coder providers that create Coders.
- DefaultCoderCloudObjectTranslatorRegistrar - Class in org.apache.beam.runners.dataflow.util
-
The CoderCloudObjectTranslatorRegistrar containing the default collection of Coder Cloud Object Translators.
- DefaultCoderCloudObjectTranslatorRegistrar() - Constructor for class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
- DefaultCoderProvider() - Constructor for class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar.DefaultCoderProvider
- DefaultCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar
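A minimal sketch of the DefaultCoder annotation on a hypothetical value class; SerializableCoder is used here only for illustration:

    import java.io.Serializable;
    import org.apache.beam.sdk.coders.DefaultCoder;
    import org.apache.beam.sdk.coders.SerializableCoder;

    // Elements of this type are encoded with SerializableCoder wherever
    // a coder is not set explicitly on the PCollection.
    @DefaultCoder(SerializableCoder.class)
    public class Purchase implements Serializable {
      public String item;
      public long amountCents;
    }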
- DefaultConcludeTransform() - Constructor for class org.apache.beam.sdk.testing.PAssert.DefaultConcludeTransform
- defaultConfig(JdbcConnection, Collection<RuleSet>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
- DefaultErrorHandler() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.DefaultErrorHandler
- DefaultExecutableStageContext - Class in org.apache.beam.runners.fnexecution.control
-
Implementation of an ExecutableStageContext.
- defaultFactory() - Static method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
The default ClientBuilderFactory instance.
- DefaultFilenamePolicy - Class in org.apache.beam.sdk.io
-
A default FileBasedSink.FilenamePolicy for windowed and unwindowed files.
- DefaultFilenamePolicy.Params - Class in org.apache.beam.sdk.io
-
Encapsulates constructor parameters to DefaultFilenamePolicy.
- DefaultFilenamePolicy.ParamsCoder - Class in org.apache.beam.sdk.io
-
A Coder for DefaultFilenamePolicy.Params.
- DefaultGcpRegionFactory - Class in org.apache.beam.runners.dataflow.options
-
Factory for a default value for Google Cloud region according to https://cloud.google.com/compute/docs/gcloud-compute/#default-properties.
- DefaultGcpRegionFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DefaultGcpRegionFactory
- DefaultGoogleAdsClientFactory - Class in org.apache.beam.sdk.io.googleads
-
The default way to construct a GoogleAdsClient.
- DefaultGoogleAdsClientFactory() - Constructor for class org.apache.beam.sdk.io.googleads.DefaultGoogleAdsClientFactory
- DefaultJobBundleFactory - Class in org.apache.beam.runners.fnexecution.control
-
A JobBundleFactory for which the implementation can specify a custom EnvironmentFactory for environment management.
- DefaultJobBundleFactory.ServerInfo - Class in org.apache.beam.runners.fnexecution.control
-
A container for EnvironmentFactory and its corresponding Grpc servers.
- DefaultJobBundleFactory.WrappedSdkHarnessClient - Class in org.apache.beam.runners.fnexecution.control
-
Holder for an SdkHarnessClient along with its associated state and data servers.
- DefaultJobServerConfigFactory() - Constructor for class org.apache.beam.runners.portability.testing.TestPortablePipelineOptions.DefaultJobServerConfigFactory
- DefaultMaxCacheMemoryUsageMb() - Constructor for class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMb
- DefaultMaxCacheMemoryUsageMbFactory() - Constructor for class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMbFactory
- defaultNaming(String, String) - Static method in class org.apache.beam.sdk.io.FileIO.Write
- defaultNaming(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.FileIO.Write
-
Defines a default FileIO.Write.FileNaming which will use the prefix and suffix supplied to create a name based on the window, pane, number of shards, shard index, and compression.
- defaultOptions() - Static method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Factory method to return a new instance of RpcQosOptions with all default values.
- DefaultPipelineOptionsRegistrar - Class in org.apache.beam.sdk.options
-
A PipelineOptionsRegistrar containing the PipelineOptions subclasses available by default.
- DefaultPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.options.DefaultPipelineOptionsRegistrar
- DefaultProjectFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.DefaultProjectFactory
- DefaultRateLimiter(BackOff, BackOff) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
- DefaultRateLimiter(Duration, Duration, Duration) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
- DefaultRetryStrategy() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.DefaultRetryStrategy
- defaults() - Static method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- Defaults() - Constructor for class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
- DefaultS3ClientBuilderFactory - Class in org.apache.beam.sdk.io.aws2.s3
-
Constructs an S3ClientBuilder with default values of S3 client properties such as path-style access and accelerated mode.
- DefaultS3ClientBuilderFactory() - Constructor for class org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
- DefaultS3FileSystemSchemeRegistrar - Class in org.apache.beam.sdk.io.aws2.s3
-
Registers the "s3" URI scheme to be handled by S3FileSystem.
- DefaultS3FileSystemSchemeRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.s3.DefaultS3FileSystemSchemeRegistrar
- DefaultSchema - Annotation Interface in org.apache.beam.sdk.schemas.annotations
-
The DefaultSchema annotation specifies a SchemaProvider class to handle obtaining a schema and row for the specified class.
- DefaultSchema.DefaultSchemaProvider - Class in org.apache.beam.sdk.schemas.annotations
-
SchemaProvider for default schemas.
- DefaultSchema.DefaultSchemaProviderRegistrar - Class in org.apache.beam.sdk.schemas.annotations
-
Registrar for default schemas.
- DefaultSchemaProvider() - Constructor for class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
- DefaultSchemaProviderRegistrar() - Constructor for class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProviderRegistrar
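A minimal sketch of the DefaultSchema annotation on a hypothetical POJO, using JavaFieldSchema to infer a schema from its public fields:

    import org.apache.beam.sdk.schemas.JavaFieldSchema;
    import org.apache.beam.sdk.schemas.annotations.DefaultSchema;

    // A schema is inferred from the public fields, so the type can be used
    // with schema-aware transforms.
    @DefaultSchema(JavaFieldSchema.class)
    public class User {
      public String name;
      public int age;
    }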
- DefaultSequenceCombiner<EventKeyT, EventT, StateT> - Class in org.apache.beam.sdk.extensions.ordered.combiner
-
Default global sequence combiner.
- DefaultSequenceCombiner(EventExaminer<EventT, StateT>) - Constructor for class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- DefaultStopPipelineWatermarkFactory() - Constructor for class org.apache.beam.runners.spark.TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory
- DefaultTableFilter - Class in org.apache.beam.sdk.extensions.sql.meta
-
The default implementation of the BeamSqlTableFilter interface.
- DefaultTableFilter(List<RexNode>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.DefaultTableFilter
- DefaultTrigger - Class in org.apache.beam.sdk.transforms.windowing
-
A trigger that is equivalent to Repeatedly.forever(AfterWatermark.pastEndOfWindow()).
- defaultType() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
- DefaultTypeConversionsFactory() - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
- defaultValue() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- defaultValue() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
- defaultValue() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns the default value when there are no values added to the accumulator.
- defaultValue() - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
-
Returns the default value when there are no values added to the accumulator.
- defaultValue() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
- defaultValue() - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
-
Returns the default value of this transform, or null if there isn't one.
- DefaultValueFactory<T> - Interface in org.apache.beam.sdk.options
-
An interface used with the Default.InstanceFactory annotation to specify the class that will be an instance factory to produce default values for a given getter on PipelineOptions.
- Defining Your Own PipelineOptions - Search tag in interface org.apache.beam.sdk.options.PipelineOptions
- Section
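A minimal sketch of a DefaultValueFactory wired up through the Default.InstanceFactory annotation (the factory and option names are illustrative):

    import org.apache.beam.sdk.options.Default;
    import org.apache.beam.sdk.options.DefaultValueFactory;
    import org.apache.beam.sdk.options.PipelineOptions;

    public class CoresFactory implements DefaultValueFactory<Integer> {
      @Override
      public Integer create(PipelineOptions options) {
        // Computed at options-creation time rather than hard-coded.
        return Runtime.getRuntime().availableProcessors();
      }
    }

    public interface WorkerOptions extends PipelineOptions {
      @Default.InstanceFactory(CoresFactory.class)
      Integer getWorkerThreads();
      void setWorkerThreads(Integer value);
    }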
- DEFLATE - Enum constant in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- DEFLATE - Enum constant in enum class org.apache.beam.sdk.io.Compression
-
Deflate compression.
- DEFLATE - Enum constant in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- DEFLATE - Enum constant in enum class org.apache.beam.sdk.io.TextIO.CompressionType
-
Deprecated.
- DEFLATE - Enum constant in enum class org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
-
Deprecated.
- deidentify(String, String, DeidentifyConfig) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Deidentify FHIR resources.
- deidentify(ValueProvider<String>, ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Deidentify FHIR resources.
- deidentify(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
- Deidentify(ValueProvider<String>, ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify
- deidentifyFhirStore(String, String, DeidentifyConfig) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Deidentify a GCP FHIR Store and write the result into a new FHIR Store.
- deidentifyFhirStore(String, String, DeidentifyConfig) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- DeidentifyFn(ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
- delay(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
-
For internal use only; no backwards-compatibility guarantees.
- Delay() - Constructor for class org.apache.beam.sdk.transforms.windowing.TimestampTransform.Delay
- DelayIntervalRateLimiter() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
- DelayIntervalRateLimiter(Supplier<Duration>) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
- delegate() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
- delegate(HasDisplayData) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register display data from the specified component on behalf of the current component.
- delegateBasedUponType(EnumMap<BeamFnApi.StateKey.TypeCase, StateRequestHandler>) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
-
Returns a StateRequestHandler which delegates to the supplied handler depending on the BeamFnApi.StateRequest's type.
- DelegateCoder<T, IntermediateT> - Class in org.apache.beam.sdk.coders
-
A DelegateCoder<T, IntermediateT> wraps a Coder for IntermediateT and encodes/decodes values of type T by converting to/from IntermediateT and then encoding/decoding using the underlying Coder<IntermediateT>.
- DelegateCoder(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.DelegateCoder
- DelegateCoder.CodingFunction<InputT, OutputT> - Interface in org.apache.beam.sdk.coders
-
A CodingFunction<InputT, OutputT> is a serializable function from InputT to OutputT that may throw any Exception.
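A minimal sketch of building a DelegateCoder via its static of(...) factory, assuming a UUID is represented by its string form (the example type is illustrative):

    import java.util.UUID;
    import org.apache.beam.sdk.coders.DelegateCoder;
    import org.apache.beam.sdk.coders.StringUtf8Coder;

    // Encode UUIDs by converting to/from String and reusing StringUtf8Coder.
    DelegateCoder<UUID, String> uuidCoder =
        DelegateCoder.of(
            StringUtf8Coder.of(),
            UUID::toString,    // T -> IntermediateT
            UUID::fromString); // IntermediateT -> T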
- DelegatingCounter - Class in org.apache.beam.sdk.metrics
-
Implementation of Counter that delegates to the instance for the current context.
- DelegatingCounter(MetricName) - Constructor for class org.apache.beam.sdk.metrics.DelegatingCounter
-
Create a DelegatingCounter with perWorkerCounter and processWideContainer set to false.
- DelegatingCounter(MetricName, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingCounter
-
Create a DelegatingCounter with perWorkerCounter set to false.
- DelegatingCounter(MetricName, boolean, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingCounter
- DelegatingDistribution - Class in org.apache.beam.sdk.metrics
-
Implementation of Distribution that delegates to the instance for the current context.
- DelegatingDistribution(MetricName) - Constructor for class org.apache.beam.sdk.metrics.DelegatingDistribution
- DelegatingDistribution(MetricName, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingDistribution
- DelegatingGauge - Class in org.apache.beam.sdk.metrics
-
Implementation of Gauge that delegates to the instance for the current context.
- DelegatingGauge(MetricName) - Constructor for class org.apache.beam.sdk.metrics.DelegatingGauge
-
Create a DelegatingGauge with perWorkerGauge and processWideContainer set to false.
- DelegatingGauge(MetricName, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingGauge
- DelegatingHistogram - Class in org.apache.beam.sdk.metrics
-
Implementation of Histogram that delegates to the instance for the current context.
- DelegatingHistogram(MetricName, HistogramData.BucketType, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingHistogram
- delete() - Static method in class org.apache.beam.sdk.io.cassandra.CassandraIO
-
Provide a CassandraIO.Write PTransform to delete data from a Cassandra database.
- delete(Collection<ClassLoaderFileSystem.ClassLoaderResourceId>) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
- delete(Collection<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Deletes a collection of resources.
- delete(Collection<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
-
Deletes a collection of resources.
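A minimal sketch of deleting matched resources with FileSystems (the glob is illustrative; IOException handling is omitted):

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import org.apache.beam.sdk.io.FileSystems;
    import org.apache.beam.sdk.io.fs.MatchResult;
    import org.apache.beam.sdk.io.fs.MoveOptions;
    import org.apache.beam.sdk.io.fs.ResourceId;

    List<MatchResult> matches =
        FileSystems.match(Collections.singletonList("gs://my-bucket/tmp/*"));
    List<ResourceId> toDelete = new ArrayList<>();
    for (MatchResult match : matches) {
      for (MatchResult.Metadata metadata : match.metadata()) {
        toDelete.add(metadata.resourceId());
      }
    }
    FileSystems.delete(toDelete, MoveOptions.StandardMoveOptions.IGNORE_MISSING_FILES);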
- DELETE - Enum constant in enum class org.apache.beam.sdk.io.cassandra.CassandraIO.MutationType
- DELETE - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation.MutationType
- DELETE - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
- DELETE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- deleteAsync(T) - Method in interface org.apache.beam.sdk.io.cassandra.Mapper
-
This method is called for each delete event.
- DeleteBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.DeleteBuilder
- deleteDataset(String, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Deletes the dataset specified by the datasetId value.
- deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
-
Deletes the dataset specified by the datasetId value.
- deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- deleteDicomStore(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Delete a Dicom Store.
- deleteDicomStore(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- deleteEntity() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
-
Returns an empty DatastoreV1.DeleteEntity builder.
- deleteFhirStore(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Delete Fhir store.
- deleteFhirStore(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- deleteFile() - Method in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
- deleteHL7v2Message(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Deletes an HL7v2 message.
- deleteHL7v2Message(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- deleteHL7v2Store(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Deletes an HL7v2 store.
- deleteHL7v2Store(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- deleteKey() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
-
Returns an empty DatastoreV1.DeleteKey builder.
- deleteNewPartition(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
This is the second step of the two-phase delete.
- deletePartitionMetadataTable(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Drops the metadata table.
- deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Delete PubsubClient.SchemaPath.
- deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Delete PubsubClient.SchemaPath.
- deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
Delete PubsubClient.SchemaPath.
- deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
Delete PubsubClient.SchemaPath.
- deleteStaticCaches() - Static method in class org.apache.beam.runners.flink.translation.utils.Workarounds
- deleteStreamPartitionRow(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
This is the second step of the two-phase delete of StreamPartition.
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Delete subscription.
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- deleteTable(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Deletes the table specified by tableId from the dataset.
- deleteTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
-
Deletes the table specified by tableId from the dataset.
- deleteTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- deleteTable(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- deleteTimer(StateNamespace, String, String) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- deleteTimer(StateNamespace, String, String, TimeDomain) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- deleteTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- deleteTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.translation.streaming.ParDoStateUpdateFn.SparkTimerInternalsIterator
- deleteTimer(Instant, TimeDomain) - Method in interface org.apache.beam.sdk.state.Timers
-
Removes the timer set in this context for the timestamp and timeDomain.
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- delimitElement() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
- dependencies(Row, PipelineOptions) - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
List the dependencies needed for this transform.
- dependencies(Row, PipelineOptions) - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
- Dependencies - Search tag in class org.apache.beam.io.debezium.DebeziumIO
- Section
- Dependency - Class in org.apache.beam.sdk.expansion.service
- Dependency() - Constructor for class org.apache.beam.sdk.expansion.service.Dependency
- dependsOnlyOnEarliestTimestamp() - Method in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Returns true if the result of combining many output timestamps actually depends only on the earliest.
- dependsOnlyOnWindow() - Method in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Returns true if the result does not depend on what outputs were combined but only the window they are in.
- DequeCoder<T> - Class in org.apache.beam.sdk.coders
- DequeCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.DequeCoder
- deregister() - Method in interface org.apache.beam.runners.fnexecution.state.StateDelegator.Registration
-
De-registers the handler for all future requests for state for the registered process bundle instruction id.
- deriveIterableValueCoder(WindowedValues.FullWindowedValueCoder) - Static method in class org.apache.beam.runners.jet.Utils
- deriveRowType() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
- deriveRowType() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
- deriveRowType() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
- deriveUncollectRowType(RelNode, boolean) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
-
Returns the row type returned by applying the 'UNNEST' operation to a relational expression.
- describe(Set<Class<? extends PipelineOptions>>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Outputs the set of options available to be set for the passed in PipelineOptions interfaces.
- describeMismatchSafely(BigqueryMatcher.TableAndQuery, Description) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- describeMismatchSafely(ShardedFile, Description) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
- describeMismatchSafely(T, Description) - Method in class org.apache.beam.sdk.testing.JsonMatcher
- describePipelineOptions(JobApi.DescribePipelineOptionsRequest, StreamObserver<JobApi.DescribePipelineOptionsResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- describeTo(Description) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.JsonMatcher
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.RegexMatcher
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
- description() - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromMySqlSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromOracleSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromPostgresSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromSqlServerSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToMySqlSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToOracleSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToPostgresSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToSqlServerSchemaTransformProvider
- description() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
- description() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
Returns a description regarding the SchemaTransform represented by the SchemaTransformProvider.
- Description - Annotation Interface in org.apache.beam.sdk.options
-
Descriptions are used to generate human readable output when the --help command is specified.
- deserialize(byte[]) - Method in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializer
- deserialize(byte[], DataInputView) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.beam.sdk.options.ValueProvider.Deserializer
- deserialize(String, byte[]) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
- deserialize(StateNamespace, DataInputView) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- deserialize(DataInputView) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- deserialize(DataInputView) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- deserialize(DataInputView) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- deserialize(T, DataInputView) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- deserializeAwsCredentialsProvider(String) - Static method in class org.apache.beam.sdk.io.aws2.options.AwsSerializableUtils
- DeserializeBytesIntoPubsubMessagePayloadOnly() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly
- deserializeObject(byte[]) - Static method in class org.apache.beam.runners.flink.translation.utils.SerdeUtils
- deserializeOneOf(Expression, List<Encoder<T>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
- DeserializerProvider<T> - Interface in org.apache.beam.sdk.io.kafka
-
Provides a configured Deserializer instance and its associated Coder.
- deserializeTimers(Collection<byte[]>, TimerInternals.TimerDataCoderV2) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- desiredBundleSizeBytes - Variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- desiredRequestParallelism - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytes
- desiredRequestParallelism - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytesWithContext
- desiredRequestParallelism - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUri
- desiredRequestParallelism - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUriWithContext
- Destination() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.Destination
- DESTINATION - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- Detailed description - Search tag in org.apache.beam.sdk.io.BoundedSource.BoundedReader.splitAtFraction(double)
- Section
- detect(String) - Static method in enum class org.apache.beam.sdk.io.Compression
- DETECT_NEW_PARTITION_SUFFIX - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- detectNewPartitionsAction(ChangeStreamMetrics, MetadataTableDao, Instant, GenerateInitialPartitionsAction, ResumeFromPreviousPipelineAction, ProcessNewPartitionsAction) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class for processing DetectNewPartitionsDoFn.
- detectNewPartitionsAction(PartitionMetadataDao, PartitionMetadataMapper, WatermarkCache, ChangeStreamMetrics, Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a single instance of an action class capable of detecting and scheduling new partitions to be queried.
- DetectNewPartitionsAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
-
This class processes DetectNewPartitionsDoFn.
- DetectNewPartitionsAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is responsible for scheduling partitions.
- DetectNewPartitionsAction(ChangeStreamMetrics, MetadataTableDao, Instant, GenerateInitialPartitionsAction, ResumeFromPreviousPipelineAction, ProcessNewPartitionsAction) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.DetectNewPartitionsAction
- DetectNewPartitionsAction(PartitionMetadataDao, PartitionMetadataMapper, WatermarkCache, ChangeStreamMetrics, Duration) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DetectNewPartitionsAction
-
Constructs an action class for detecting / scheduling new partitions.
- DetectNewPartitionsDoFn - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
- DetectNewPartitionsDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
A SplittableDoFn (SDF) that is responsible for scheduling partitions to be queried.
- DetectNewPartitionsDoFn(DaoFactory, MapperFactory, ActionFactory, CacheFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
-
This class needs a DaoFactory to build DAOs to access the partition metadata tables.
- DetectNewPartitionsDoFn(Instant, ActionFactory, DaoFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
- DetectNewPartitionsRangeTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
This restriction tracker delegates most of its behavior to an internal
TimestampRangeTracker
. - DetectNewPartitionsRangeTracker(TimestampRange) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.DetectNewPartitionsRangeTracker
- DetectNewPartitionsState - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
-
Metadata of the progress of
DetectNewPartitionsDoFn
from the metadata table. - DetectNewPartitionsState(Instant, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
- DetectNewPartitionsTracker - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction
- DetectNewPartitionsTracker(long) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.DetectNewPartitionsTracker
- detectStreamingMode(Pipeline, StreamingOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
-
Analyse the pipeline to determine if we have to switch to streaming mode for the pipeline translation and update
StreamingOptions
accordingly. - DicomIO - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The DicomIO connectors allows Beam pipelines to make calls to the Dicom API of the Google Cloud Healthcare API (https://cloud.google.com/healthcare/docs/how-tos#dicom-guide).
- DicomIO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.DicomIO
- DicomIO.ReadStudyMetadata - Class in org.apache.beam.sdk.io.gcp.healthcare
-
This class makes a call to the retrieve metadata endpoint (https://cloud.google.com/healthcare/docs/how-tos/dicomweb#retrieving_metadata).
- DicomIO.ReadStudyMetadata.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
- dicomStorePath - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- DicomWebPath() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- DIRECT_READ - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
-
Read the contents of a table directly using the BigQuery storage API.
- Direct and persistent messages, and latency metrics - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- DirectOptions - Interface in org.apache.beam.runners.direct
-
Options that can be used to configure the DirectRunner.
- DirectOptions.AvailableParallelismFactory - Class in org.apache.beam.runners.direct
-
A DefaultValueFactory that returns the result of Runtime.availableProcessors() from the DirectOptions.AvailableParallelismFactory.create(PipelineOptions) method.
- DIRECTORY_CONTAINER - Static variable in class org.apache.beam.sdk.io.ShardNameTemplate
-
Shard is a file within a directory.
- DirectRegistrar - Class in org.apache.beam.runners.direct
- DirectRegistrar.Options - Class in org.apache.beam.runners.direct
-
Registers the DirectOptions.
- DirectRegistrar.Runner - Class in org.apache.beam.runners.direct
-
Registers the DirectRunner.
- DirectRunner - Class in org.apache.beam.runners.direct
- DirectRunner() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.DirectRunner
- DirectRunner.DirectPipelineResult - Class in org.apache.beam.runners.direct
-
The result of running a Pipeline with the DirectRunner.
- DirectStreamObserver<T> - Class in org.apache.beam.sdk.fn.stream
-
A StreamObserver which uses synchronization on the underlying CallStreamObserver to provide thread safety.
- DirectStreamObserver(Phaser, CallStreamObserver<T>) - Constructor for class org.apache.beam.sdk.fn.stream.DirectStreamObserver
- DirectTestOptions - Interface in org.apache.beam.runners.direct
-
Internal-only options for tweaking the behavior of the PipelineOptions.DirectRunner in ways that users should never do.
- DISALLOW - Enum constant in enum class org.apache.beam.sdk.io.fs.EmptyMatchTreatment
-
Filepatterns matching no resources are disallowed.
- DISALLOW_COMBINER_LIFTING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- DISALLOWED_CONSUMER_PROPERTIES - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIOUtils
- discard() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
- discardDataset(Dataset) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- DISCARDING_FIRED_PANES - Enum constant in enum class org.apache.beam.sdk.values.WindowingStrategy.AccumulationMode
- discardingFiredPanes() - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Returns a new Window PTransform that uses the registered WindowFn and triggering behavior, and that discards elements in a pane after they are triggered.
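A minimal sketch of discardingFiredPanes in a windowing configuration (window size and firing delay are illustrative):

    import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
    import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.joda.time.Duration;

    // Early panes fire a minute after the first element; once a pane fires,
    // its elements are discarded rather than accumulated into later panes.
    input.apply(
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(5)))
            .triggering(
                AfterWatermark.pastEndOfWindow()
                    .withEarlyFirings(
                        AfterProcessingTime.pastFirstElementInPane()
                            .plusDelayOf(Duration.standardMinutes(1))))
            .withAllowedLateness(Duration.ZERO)
            .discardingFiredPanes());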
- discoverSchemaTransform(ExpansionApi.DiscoverSchemaTransformRequest, StreamObserver<ExpansionApi.DiscoverSchemaTransformResponse>) - Method in class org.apache.beam.sdk.expansion.service.ExpansionService
- dispatchBag(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
- dispatchBag(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- dispatchCombining(Combine.CombineFn<?, ?, ?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
- dispatchCombining(Combine.CombineFn<?, ?, ?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- dispatchDefault() - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- dispatchMap(Coder<?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
- dispatchMap(Coder<?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- dispatchMultimap(Coder<?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
- dispatchMultimap(Coder<?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- dispatchOrderedList(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
- dispatchSet(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
- dispatchSet(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- dispatchValue(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
- dispatchValue(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- DISPLAY_DATA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- displayData - Variable in class org.apache.beam.sdk.transforms.PTransform
- DisplayData - Class in org.apache.beam.sdk.transforms.display
-
Static display data associated with a pipeline component.
- DisplayData.Builder - Interface in org.apache.beam.sdk.transforms.display
-
Utility to build up display data from a component and its included subcomponents.
- DisplayData.Identifier - Class in org.apache.beam.sdk.transforms.display
-
Unique identifier for a display data item within a component.
- DisplayData.Item - Class in org.apache.beam.sdk.transforms.display
-
Items are the unit of display data.
- DisplayData.ItemSpec<T> - Class in org.apache.beam.sdk.transforms.display
-
Specifies a DisplayData.Item to register as display data.
- DisplayData.Path - Class in org.apache.beam.sdk.transforms.display
-
Structured path of registered display data within a component hierarchy.
- DisplayData.Type - Enum Class in org.apache.beam.sdk.transforms.display
-
Display data type.
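A minimal sketch of registering display data from a DoFn via the DisplayData.Builder listed above (the field name and label are illustrative):

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.display.DisplayData;

    class ScoreFn extends DoFn<String, String> {
      private final int threshold = 10;

      @Override
      public void populateDisplayData(DisplayData.Builder builder) {
        super.populateDisplayData(builder);
        // Shown in runner UIs alongside the step that uses this DoFn.
        builder.add(DisplayData.item("threshold", threshold).withLabel("Score threshold"));
      }

      @ProcessElement
      public void processElement(ProcessContext c) {
        c.output(c.element());
      }
    }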
- Distinct<T> - Class in org.apache.beam.sdk.transforms
-
Distinct<T> takes a PCollection<T> and returns a PCollection<T> that has all distinct elements of the input.
- Distinct() - Constructor for class org.apache.beam.sdk.transforms.Distinct
- Distinct.WithRepresentativeValues<T, IdT> - Class in org.apache.beam.sdk.transforms
-
A Distinct PTransform that uses a SerializableFunction to obtain a representative value for each input element.
- distribution(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that records various statistics about the distribution of reported values.
- distribution(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that records various statistics about the distribution of reported values.
- Distribution - Interface in org.apache.beam.sdk.metrics
-
A metric that reports information about the distribution of reported values.
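A minimal sketch of the Metrics.distribution factory inside a DoFn (the metric name is illustrative):

    import org.apache.beam.sdk.metrics.Distribution;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    class RecordSizeFn extends DoFn<String, String> {
      private final Distribution sizes =
          Metrics.distribution(RecordSizeFn.class, "elementSize");

      @ProcessElement
      public void processElement(ProcessContext c) {
        // The runner reports statistics such as sum, count, min, and max.
        sizes.update(c.element().length());
        c.output(c.element());
      }
    }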
- DistributionImpl - Class in org.apache.beam.runners.jet.metrics
-
Implementation of Distribution.
- DistributionImpl(MetricName) - Constructor for class org.apache.beam.runners.jet.metrics.DistributionImpl
- DistributionResult - Class in org.apache.beam.sdk.metrics
-
The result of a Distribution metric.
- DistributionResult() - Constructor for class org.apache.beam.sdk.metrics.DistributionResult
- DIVIDE - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- divideBy(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- DLP_PAYLOAD_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- DLP_PAYLOAD_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- DLP_PAYLOAD_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- DLPDeidentifyText - Class in org.apache.beam.sdk.extensions.ml
-
A PTransform connecting to Cloud DLP (https://cloud.google.com/dlp/docs/libraries) and deidentifying text according to provided settings.
- DLPDeidentifyText() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- DLPDeidentifyText.Builder - Class in org.apache.beam.sdk.extensions.ml
- DLPInspectText - Class in org.apache.beam.sdk.extensions.ml
-
A PTransform connecting to Cloud DLP (https://cloud.google.com/dlp/docs/libraries) and inspecting text for identifying data according to provided settings.
- DLPInspectText() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPInspectText
- DLPInspectText.Builder - Class in org.apache.beam.sdk.extensions.ml
- DLPReidentifyText - Class in org.apache.beam.sdk.extensions.ml
-
A PTransform connecting to Cloud DLP (https://cloud.google.com/dlp/docs/libraries) and re-identifying text according to provided settings.
- DLPReidentifyText() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- DLPReidentifyText.Builder - Class in org.apache.beam.sdk.extensions.ml
- DlqProvider - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- DlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
- DO_NOT_CLONE - Enum constant in enum class org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior
-
Deprecated.
- DO_NOT_ENTER_TRANSFORM - Enum constant in enum class org.apache.beam.sdk.Pipeline.PipelineVisitor.CompositeBehavior
- doChecks(PAssert.PAssertionSite, ActualT, SerializableFunction<ActualT, Void>) - Static method in class org.apache.beam.sdk.testing.PAssert
- DockerEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
-
An EnvironmentFactory that creates Docker containers by shelling out to docker.
- DockerEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
-
Provider for DockerEnvironmentFactory.
- docToBulk() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
- DocToBulk() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
- Document() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
- doesMetadataTableExist() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- doFn - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- DoFn<InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
The argument to ParDo providing the code to use to process elements of the input PCollection.
- DoFn() - Constructor for class org.apache.beam.sdk.transforms.DoFn
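As a quick illustration of the DoFn entry above, a minimal sketch of a DoFn applied with ParDo (the pipeline variable p is hypothetical):
    // A minimal sketch: a DoFn that upper-cases each input string.
    PCollection<String> upper =
        p.apply(Create.of("a", "b"))
            .apply(
                ParDo.of(
                    new DoFn<String, String>() {
                      @ProcessElement
                      public void processElement(
                          @Element String word, OutputReceiver<String> out) {
                        out.output(word.toUpperCase());
                      }
                    }));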
- DoFn.AlwaysFetched - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for declaring that a state parameter is always fetched.
- DoFn.BoundedPerElement - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation on a splittable DoFn specifying that the DoFn performs a bounded amount of work per input element, so applying it to a bounded PCollection will also produce a bounded PCollection.
- DoFn.BundleFinalizer - Interface in org.apache.beam.sdk.transforms
-
A parameter that is accessible during @StartBundle, @ProcessElement, and @FinishBundle that allows the caller to register a callback that will be invoked after the bundle has been successfully completed and the runner has committed the output.
- DoFn.BundleFinalizer.Callback - Interface in org.apache.beam.sdk.transforms
-
An instance of a function that will be invoked after bundle finalization.
- DoFn.Element - Annotation Interface in org.apache.beam.sdk.transforms
-
Parameter annotation for the input element for DoFn.ProcessElement, DoFn.GetInitialRestriction, DoFn.GetSize, DoFn.SplitRestriction, DoFn.GetInitialWatermarkEstimatorState, DoFn.NewWatermarkEstimator, and DoFn.NewTracker methods.
- DoFn.FieldAccess - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for specifying specific fields that are accessed in a Schema PCollection.
- DoFn.FinishBundle - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method to use to finish processing a batch of elements.
- DoFn.FinishBundleContext - Class in org.apache.beam.sdk.transforms
-
Information accessible while within the DoFn.FinishBundle method.
- DoFn.GetInitialRestriction - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that maps an element to an initial restriction for a splittable DoFn.
- DoFn.GetInitialWatermarkEstimatorState - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that maps an element and restriction to initial watermark estimator state for a splittable DoFn.
- DoFn.GetRestrictionCoder - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that returns the coder to use for the restriction of a splittable DoFn.
- DoFn.GetSize - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that returns the corresponding size for an element and restriction pair.
- DoFn.GetWatermarkEstimatorStateCoder - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that returns the coder to use for the watermark estimator state of a splittable DoFn.
- DoFn.Key - Annotation Interface in org.apache.beam.sdk.transforms
-
Parameter annotation for dereferencing the input element key in a KV pair.
- DoFn.MultiOutputReceiver - Interface in org.apache.beam.sdk.transforms
-
Receives tagged output for a multi-output function.
- DoFn.NewTracker - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that creates a new RestrictionTracker for the restriction of a splittable DoFn.
- DoFn.NewWatermarkEstimator - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that creates a new WatermarkEstimator for the watermark state of a splittable DoFn.
- DoFn.OnTimer - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for registering a callback for a timer.
- DoFn.OnTimerContext - Class in org.apache.beam.sdk.transforms
-
Information accessible when running a DoFn.OnTimer method.
- DoFn.OnTimerFamily - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for registering a callback for a timerFamily.
- DoFn.OnWindowExpiration - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method to use for performing actions on window expiration.
- DoFn.OnWindowExpirationContext - Class in org.apache.beam.sdk.transforms
- DoFn.OutputReceiver<T> - Interface in org.apache.beam.sdk.transforms
-
Receives values of the given type.
- DoFn.ProcessContext - Class in org.apache.beam.sdk.transforms
-
Information accessible when running a DoFn.ProcessElement method.
- DoFn.ProcessContinuation - Class in org.apache.beam.sdk.transforms
-
When used as a return value of DoFn.ProcessElement, indicates whether there is more work to be done for the current element.
- DoFn.ProcessElement - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method to use for processing elements.
- DoFn.RequiresStableInput - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation that may be added to a DoFn.ProcessElement, DoFn.OnTimer, or DoFn.OnWindowExpiration method to indicate that the runner must ensure that the observable contents of the input PCollection or mutable state are stable upon retries.
- DoFn.RequiresTimeSortedInput - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation that may be added to a DoFn.ProcessElement method to indicate that the runner must ensure that the observable contents of the input PCollection are sorted by time, in ascending order.
- DoFn.Restriction - Annotation Interface in org.apache.beam.sdk.transforms
-
Parameter annotation for the restriction for DoFn.GetSize, DoFn.SplitRestriction, DoFn.GetInitialWatermarkEstimatorState, DoFn.NewWatermarkEstimator, and DoFn.NewTracker methods.
- DoFn.Setup - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method to use to prepare an instance for processing bundles of elements.
- DoFn.SideInput - Annotation Interface in org.apache.beam.sdk.transforms
-
Parameter annotation for the SideInput for a DoFn.ProcessElement method.
- DoFn.SplitRestriction - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that splits the restriction of a splittable DoFn into multiple parts to be processed in parallel.
- DoFn.StartBundle - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method to use to prepare an instance for processing a batch of elements.
- DoFn.StartBundleContext - Class in org.apache.beam.sdk.transforms
-
Information accessible while within the DoFn.StartBundle method.
- DoFn.StateId - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for declaring and dereferencing state cells.
- DoFn.Teardown - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method to use to clean up this instance before it is discarded.
- DoFn.TimerFamily - Annotation Interface in org.apache.beam.sdk.transforms
-
Parameter annotation for the TimerMap for a DoFn.ProcessElement method.
- DoFn.TimerId - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for declaring and dereferencing timers.
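The DoFn.StateId and DoFn.TimerId entries above are typically used together in a stateful DoFn over keyed input; a minimal sketch (the state and timer names "count" and "flush" are hypothetical):
    // A minimal sketch: count elements per key and emit the count when an
    // event-time timer fires at the end of the window.
    new DoFn<KV<String, Long>, Long>() {
      @StateId("count")
      private final StateSpec<ValueState<Long>> countSpec = StateSpecs.value(VarLongCoder.of());

      @TimerId("flush")
      private final TimerSpec flushSpec = TimerSpecs.timer(TimeDomain.EVENT_TIME);

      @ProcessElement
      public void process(
          @StateId("count") ValueState<Long> count,
          @TimerId("flush") Timer flush,
          BoundedWindow window) {
        Long current = count.read();
        count.write((current == null ? 0L : current) + 1);
        flush.set(window.maxTimestamp()); // fire once the window closes
      }

      @OnTimer("flush")
      public void onFlush(
          @StateId("count") ValueState<Long> count, OutputReceiver<Long> out) {
        Long current = count.read();
        out.output(current == null ? 0L : current);
      }
    };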
- DoFn.Timestamp - Annotation Interface in org.apache.beam.sdk.transforms
-
Parameter annotation for the input element timestamp for DoFn.ProcessElement, DoFn.GetInitialRestriction, DoFn.GetSize, DoFn.SplitRestriction, DoFn.GetInitialWatermarkEstimatorState, DoFn.NewWatermarkEstimator, and DoFn.NewTracker methods.
- DoFn.TruncateRestriction - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation for the method that truncates the restriction of a splittable DoFn into a bounded one.
- DoFn.UnboundedPerElement - Annotation Interface in org.apache.beam.sdk.transforms
-
Annotation on a splittable DoFn specifying that the DoFn performs an unbounded amount of work per input element, so applying it to a bounded PCollection will produce an unbounded PCollection.
- DoFn.WatermarkEstimatorState - Annotation Interface in org.apache.beam.sdk.transforms
-
Parameter annotation for the watermark estimator state for the DoFn.NewWatermarkEstimator method.
- DoFn.WindowedContext - Class in org.apache.beam.sdk.transforms
-
Information accessible to all methods in this DoFn where the context is in some window.
- DoFnFunction<OutputT, InputT> - Class in org.apache.beam.runners.twister2.translators.functions
-
DoFn function.
- DoFnFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
- DoFnFunction(Twister2TranslationContext, DoFn<InputT, OutputT>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, List<TupleTag<?>>, WindowingStrategy<?, ?>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, TupleTag<OutputT>, DoFnSchemaInformation, Map<TupleTag<?>, Integer>, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
- DoFnOperator<PreInputT, InputT, OutputT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
-
Flink operator for executing DoFns.
- DoFnOperator(DoFn<InputT, OutputT>, String, Coder<WindowedValue<InputT>>, Map<TupleTag<?>, Coder<?>>, TupleTag<OutputT>, List<TupleTag<?>>, DoFnOperator.OutputManagerFactory<OutputT>, WindowingStrategy<?, ?>, Map<Integer, PCollectionView<?>>, Collection<PCollectionView<?>>, PipelineOptions, Coder<?>, KeySelector<WindowedValue<InputT>, ?>, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
-
Constructor for DoFnOperator.
- DoFnOperator.BufferedOutputManager<OutputT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
-
A WindowedValueReceiver that can buffer its outputs.
- DoFnOperator.FlinkStepContext - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
-
StepContext for running DoFns on Flink.
- DoFnOperator.MultiOutputOutputManagerFactory<OutputT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
-
Implementation of DoFnOperator.OutputManagerFactory that creates a DoFnOperator.BufferedOutputManager that can write to multiple logical outputs by Flink side output.
- DoFnOutputReceivers - Class in org.apache.beam.sdk.transforms
-
Common DoFn.OutputReceiver and DoFn.MultiOutputReceiver classes.
- DoFnOutputReceivers() - Constructor for class org.apache.beam.sdk.transforms.DoFnOutputReceivers
- doFnRunner - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- DoFnRunnerWithMetrics<InputT, OutputT> - Class in org.apache.beam.runners.spark.translation
-
DoFnRunner decorator which registers MetricsContainerImpl.
- DoFnRunnerWithMetrics(String, DoFnRunner<InputT, OutputT>, MetricsContainerStepMapAccumulator) - Constructor for class org.apache.beam.runners.spark.translation.DoFnRunnerWithMetrics
- DoFnRunnerWithMetricsUpdate<InputT, OutputT> - Class in org.apache.beam.runners.flink.metrics
-
DoFnRunner decorator which registers MetricsContainerImpl.
- DoFnRunnerWithMetricsUpdate(String, DoFnRunner<InputT, OutputT>, FlinkMetricContainer) - Constructor for class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
- DoFns - Search tag in class org.apache.beam.sdk.transforms.ParDo
- Section
- DoFnSchemaInformation - Class in org.apache.beam.sdk.transforms
-
Represents information about how a DoFn extracts schemas.
- DoFnSchemaInformation() - Constructor for class org.apache.beam.sdk.transforms.DoFnSchemaInformation
- DoFnSchemaInformation.Builder - Class in org.apache.beam.sdk.transforms
-
The builder object.
- DoFnTester<InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
Deprecated. Use TestPipeline with the DirectRunner.
- DoFnTester.CloningBehavior - Enum Class in org.apache.beam.sdk.transforms
-
Deprecated. Use TestPipeline with the DirectRunner.
- doHoldLock(Range.ByteStringRange, String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
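Since DoFnTester is deprecated in favor of TestPipeline, a minimal sketch of the replacement pattern (UpperCaseFn and the test values are hypothetical):
    // A minimal sketch: exercising a DoFn through TestPipeline instead of DoFnTester.
    TestPipeline p = TestPipeline.create();
    PCollection<String> output =
        p.apply(Create.of("a", "b")).apply(ParDo.of(new UpperCaseFn()));
    PAssert.that(output).containsInAnyOrder("A", "B");
    p.run().waitUntilFinish();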
-
Returns true if the uuid holds the lock of the partition.
- DONE - Enum constant in enum class org.apache.beam.sdk.PipelineResult.State
-
The job has successfully completed.
- doPartitionsOverlap(Range.ByteStringRange, Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Returns true if the two ByteStringRanges overlap, otherwise false.
- dotExpression() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- dotExpression() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
- dotExpressionComponent() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- dotExpressionComponent() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
- dotExpressionComponent(int) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
- DotExpressionComponentContext() - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
- DotExpressionComponentContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
- DotExpressionContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
- DOUBLE - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- DOUBLE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- DOUBLE - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of double fields.
- DOUBLE_NAN_WRAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- DOUBLE_NEGATIVE_INF_WRAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- DOUBLE_POSITIVE_INF_WRAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- DoubleCoder - Class in org.apache.beam.sdk.coders
-
A DoubleCoder encodes Double values in 8 bytes using Java serialization.
- doubles() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The TypeDescriptor for Double.
- doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a PTransform that takes an input PCollection<Double> and returns a PCollection<Double> whose contents is the maximum of the input PCollection's elements, or Double.NEGATIVE_INFINITY if there are no elements.
- doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a PTransform that takes an input PCollection<Double> and returns a PCollection<Double> whose contents is the minimum of the input PCollection's elements, or Double.POSITIVE_INFINITY if there are no elements.
- doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
-
Returns a PTransform that takes an input PCollection<Double> and returns a PCollection<Double> whose contents is the sum of the input PCollection's elements, or 0 if there are no elements.
- doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns a PCollection<KV<K, Double>> that contains an output element mapping each distinct key in the input PCollection to the maximum of the values associated with that key in the input PCollection.
- doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns a PCollection<KV<K, Double>> that contains an output element mapping each distinct key in the input PCollection to the minimum of the values associated with that key in the input PCollection.
- doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
-
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns a PCollection<KV<K, Double>> that contains an output element mapping each distinct key in the input PCollection to the sum of the values associated with that key in the input PCollection.
- doubleToByteArray(double) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
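As a quick illustration of the doublesGlobally() and doublesPerKey() entries above, a minimal sketch (the keyed input collection scores is hypothetical):
    // A minimal sketch: global and per-key combines over PCollection<KV<String, Double>>.
    PCollection<Double> total =
        scores.apply(Values.create()).apply(Sum.doublesGlobally()); // one global sum
    PCollection<KV<String, Double>> maxPerKey =
        scores.apply(Max.doublesPerKey()); // maximum value per distinct key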
- drive() - Method in interface org.apache.beam.runners.local.ExecutionDriver
- DriverConfiguration() - Constructor for class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- Driver configuration - Search tag in class org.apache.beam.sdk.io.neo4j.Neo4jIO
- Section
- dropCatalog(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.CatalogManager
-
Drops the catalog with this name.
- dropCatalog(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.EmptyCatalogManager
- dropCatalog(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogManager
- dropDatabase(String, boolean) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
-
Drops the database with this name.
- dropDatabase(String, boolean) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- dropDatabase(String, boolean) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergCatalog
- dropExpiredTimers(SparkTimerInternals, WindowingStrategy<?, W>) - Static method in class org.apache.beam.runners.spark.util.TimerUtils
- DropFields - Class in org.apache.beam.sdk.schemas.transforms
-
A transform to drop fields from a schema.
- DropFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.DropFields
- DropFields.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
-
Implementation class for DropFields.
- dropNamespace(String, boolean) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- dropping(List<String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergTableProvider
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
- dropTable(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Drops a table.
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
- dropTable(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- dropTable(SqlParserPos, boolean, SqlIdentifier) - Static method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDdlNodes
-
Creates a DROP TABLE.
- dryRunQuery(String, JobConfigurationQuery, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Dry runs the query in the given project.
- dryRunQuery(String, JobConfigurationQuery, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- dStreamValues(JavaPairDStream<T1, T2>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Transform a pair stream into a value stream.
- duplicate - Enum constant in enum class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent.Reason
- duplicate() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- duplicate() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- duplicate() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- DURATION - Enum constant in enum class org.apache.beam.sdk.transforms.display.DisplayData.Type
- DurationCoder - Class in org.apache.beam.sdk.coders
- DurationConvert() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.DurationConvert
- durationMilliSec - Variable in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
- DYNAMIC_DESTINATIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- DYNAMIC_DESTINATIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- DynamicAvroDestinations<UserT, DestinationT, OutputT> - Class in org.apache.beam.sdk.extensions.avro.io
-
A specialization of FileBasedSink.DynamicDestinations for AvroIO.
- DynamicAvroDestinations() - Constructor for class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
- DynamicDestinations<T, DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
This class provides the most general way of specifying dynamic BigQuery table destinations.
- DynamicDestinations - Interface in org.apache.beam.sdk.io.iceberg
- DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
- DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
- Dynamic destinations - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Dynamic Destinations - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- DynamicFileDestinations - Class in org.apache.beam.sdk.io
-
Some helper classes that derive from FileBasedSink.DynamicDestinations.
- DynamicFileDestinations() - Constructor for class org.apache.beam.sdk.io.DynamicFileDestinations
- DynamicProtoCoder - Class in org.apache.beam.sdk.extensions.protobuf
-
A Coder using Google Protocol Buffers binary format.
- dynamicWrite() - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO
- Dynamic Writing to a MQTT Broker - Search tag in class org.apache.beam.sdk.io.mqtt.MqttIO
- Section
- DynamoDBIO - Class in org.apache.beam.sdk.io.aws2.dynamodb
-
IO to read from and write to DynamoDB tables.
- DynamoDBIO() - Constructor for class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO
- DynamoDBIO.Read<T> - Class in org.apache.beam.sdk.io.aws2.dynamodb
-
Read data from DynamoDB using DynamoDBIO.Read.getScanRequestFn() and emit an element of type T for each ScanResponse using the mapping function DynamoDBIO.Read.getScanResponseMapperFn().
- DynamoDBIO.Write<T> - Class in org.apache.beam.sdk.io.aws2.dynamodb
-
Write a PCollection of data into DynamoDB.
E
- EARLIEST - Enum constant in enum class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows.StartingStrategy
- EARLIEST - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
The policy of taking at the earliest of a set of timestamps.
- EARLY - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.PaneInfo.Timing
-
Pane was fired before the input watermark had progressed after the end of the window.
- EarlyBinder(KeyedStateBackend, SerializablePipelineOptions, Coder<? extends BoundedWindow>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.EarlyBinder
- eitherOf(Watch.Growth.TerminationCondition<InputT, FirstStateT>, Watch.Growth.TerminationCondition<InputT, SecondStateT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Returns a Watch.Growth.TerminationCondition that holds when at least one of the given two conditions holds.
- ElasticsearchIO - Class in org.apache.beam.sdk.io.elasticsearch
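As a quick illustration of Watch.Growth.eitherOf above, a minimal sketch that polls a file pattern until either condition holds (the pattern and durations are hypothetical):
    // A minimal sketch: stop watching after one hour total, or after ten
    // minutes without new matches, whichever comes first.
    p.apply(
        FileIO.match()
            .filepattern("/tmp/input/*.csv")
            .continuously(
                Duration.standardSeconds(30),
                Watch.Growth.eitherOf(
                    Watch.Growth.afterTotalOf(Duration.standardHours(1)),
                    Watch.Growth.afterTimeSinceNewOutput(Duration.standardMinutes(10)))));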
-
Transforms for reading and writing data from/to Elasticsearch.
- ElasticsearchIO.BoundedElasticsearchSource - Class in org.apache.beam.sdk.io.elasticsearch
-
A BoundedSource reading from Elasticsearch.
- ElasticsearchIO.BulkIO - Class in org.apache.beam.sdk.io.elasticsearch
-
A PTransform writing Bulk API entities created by ElasticsearchIO.DocToBulk to an Elasticsearch cluster.
- ElasticsearchIO.ConnectionConfiguration - Class in org.apache.beam.sdk.io.elasticsearch
-
A POJO describing a connection configuration to Elasticsearch.
- ElasticsearchIO.DocToBulk - Class in org.apache.beam.sdk.io.elasticsearch
-
A PTransform converting docs to their Bulk API counterparts.
- ElasticsearchIO.Document - Class in org.apache.beam.sdk.io.elasticsearch
- ElasticsearchIO.DocumentCoder - Class in org.apache.beam.sdk.io.elasticsearch
- ElasticsearchIO.Read - Class in org.apache.beam.sdk.io.elasticsearch
-
A PTransform reading data from Elasticsearch.
- ElasticsearchIO.RetryConfiguration - Class in org.apache.beam.sdk.io.elasticsearch
-
A POJO encapsulating a configuration for retry behavior when issuing requests to ES.
- ElasticsearchIO.Write - Class in org.apache.beam.sdk.io.elasticsearch
-
A PTransform writing data to Elasticsearch.
- ElasticsearchIO.Write.BooleanFieldValueExtractFn - Interface in org.apache.beam.sdk.io.elasticsearch
- ElasticsearchIO.Write.FieldValueExtractFn - Interface in org.apache.beam.sdk.io.elasticsearch
- ElasticsearchIOITCommon - Class in org.apache.beam.sdk.io.elasticsearch
-
Manipulates test data used by the ElasticsearchIO integration tests.
- ElasticsearchIOITCommon() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon
- ElasticsearchIOITCommon.ElasticsearchPipelineOptions - Interface in org.apache.beam.sdk.io.elasticsearch
-
Pipeline options for elasticsearch tests.
- element() - Method in class org.apache.beam.runners.twister2.utils.Twister2AssignContext
- element() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
-
Returns the input element to be processed.
- element() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
-
Returns the current element.
- element() - Method in class org.apache.beam.sdk.transforms.WithFailures.ExceptionElement
- ELEMENT - Enum constant in enum class org.apache.beam.sdk.testing.TestStream.EventType
- elementCoder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
- elementCoder() - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.IterableSideInputHandler
-
Returns the Coder to use for the elements of the resulting values iterable.
- elementCountAtLeast(int) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterPane
-
Creates a trigger that fires when the pane contains at least countElems elements.
- ElementDelimitedOutputStream(DataStreams.OutputChunkConsumer<ByteString>, int) - Constructor for class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
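As a quick illustration of AfterPane.elementCountAtLeast above, a minimal sketch of a windowing configuration (window size and count are hypothetical):
    // A minimal sketch: fire a pane every 100 elements within 1-minute fixed windows.
    input.apply(
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(1)))
            .triggering(Repeatedly.forever(AfterPane.elementCountAtLeast(100)))
            .withAllowedLateness(Duration.ZERO)
            .discardingFiredPanes());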
- ElementEvent() - Constructor for class org.apache.beam.sdk.testing.TestStream.ElementEvent
- elements() - Static method in class org.apache.beam.sdk.transforms.ToString
- elementsIterable() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItem
- elementsRead() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Counter of elements read by a source.
- elementsReadBySplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Counter of elements read by a source split.
- elementsWritten() - Static method in class org.apache.beam.sdk.metrics.SinkMetrics
-
Counter of elements written to a sink.
- ElemToBytesFunction<V> - Class in org.apache.beam.runners.twister2.translators.functions
-
Map to tuple function.
- ElemToBytesFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ElemToBytesFunction
- ElemToBytesFunction(WindowedValues.WindowedValueCoder<V>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.ElemToBytesFunction
- EMBEDDED_ARTIFACT_URN - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- EmbeddedEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
-
An EnvironmentFactory that communicates to a FnHarness which is executing in the same process.
- EmbeddedEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
-
Provider of EmbeddedEnvironmentFactory.
- empty() - Static method in class org.apache.beam.runners.local.StructuralKey
-
Get the empty StructuralKey.
- empty() - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
- empty() - Static method in class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
- empty() - Static method in class org.apache.beam.sdk.metrics.BoundedTrieResult
- empty() - Static method in class org.apache.beam.sdk.metrics.GaugeResult
- empty() - Static method in class org.apache.beam.sdk.metrics.StringSetResult
- empty() - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Asserts that the iterable in question is empty.
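As a quick illustration of PAssert.IterableAssert.empty() above, a minimal sketch (the filter predicate is hypothetical):
    // A minimal sketch: assert that no elements survive a filter.
    PCollection<Integer> negatives =
        p.apply(Create.of(1, 2, 3))
            .apply(Filter.by((SerializableFunction<Integer, Boolean>) x -> x < 0));
    PAssert.that(negatives).empty();
    p.run();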
- empty() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
- empty() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.empty().
- empty() - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Returns an empty CoGbkResult.
- empty() - Static method in class org.apache.beam.sdk.transforms.Requirements
-
Describes an empty set of requirements.
- empty() - Static method in class org.apache.beam.sdk.values.TupleTagList
-
Returns an empty TupleTagList.
- empty(Coder<T>) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new Create.Values transform that produces an empty PCollection.
- empty(Pipeline) - Static method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Returns an empty KeyedPCollectionTuple<K> on the given pipeline.
- empty(Pipeline) - Static method in class org.apache.beam.sdk.values.PCollectionList
-
Returns an empty PCollectionList that is part of the given Pipeline.
- empty(Pipeline) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Returns an empty PCollectionRowTuple that is part of the given Pipeline.
- empty(Pipeline) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
Returns an empty PCollectionTuple that is part of the given Pipeline.
- empty(Schema) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new Create.Values transform that produces an empty PCollection of rows.
- empty(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new Create.Values transform that produces an empty PCollection.
- EMPTY - Enum constant in enum class org.apache.beam.sdk.io.snowflake.enums.WriteDisposition
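As a quick illustration of the Create.empty overloads above, a minimal sketch (the pipeline variable p is hypothetical):
    // A minimal sketch: two ways to build an empty PCollection<String>.
    PCollection<String> byCoder = p.apply(Create.empty(StringUtf8Coder.of()));
    PCollection<String> byType = p.apply(Create.empty(TypeDescriptors.strings()));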
- EMPTY - Static variable in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
- EMPTY - Static variable in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
- EMPTY - Static variable in class org.apache.beam.sdk.io.range.ByteKey
-
An empty key.
- EMPTY - Static variable in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
- EMPTY_BYTE_ARRAY - Static variable in class org.apache.beam.runners.spark.util.TimerUtils
- EMPTY_ROW - Static variable in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
- EMPTY_SCHEMA - Static variable in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
- emptyArray() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.emptyArray().
- emptyBatch() - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Adds an empty batch.
- EmptyCatalogManager - Class in org.apache.beam.sdk.extensions.sql.meta.catalog
- EmptyCatalogManager() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.catalog.EmptyCatalogManager
- EmptyCheckpointMark - Class in org.apache.beam.runners.spark.io
-
Passing null values to Spark's Java API may cause problems because of Guava preconditions.
- emptyIterable() - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables
-
Returns an empty PrefetchableIterable.
- emptyIterable() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.emptyIterable().
- emptyIterator() - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
-
Returns an empty PrefetchableIterator.
- emptyList() - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
- EmptyListDefault() - Constructor for class org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions.EmptyListDefault
- EmptyListenersList() - Constructor for class org.apache.beam.runners.spark.SparkContextOptions.EmptyListenersList
- EmptyMatchTreatment - Enum Class in org.apache.beam.sdk.io.fs
-
Options for allowing or disallowing filepatterns that match no resources in FileSystems.match(java.util.List<java.lang.String>).
- emptyProperties() - Static method in class org.apache.beam.sdk.extensions.sql.TableUtils
- emptyVoidFunction() - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
- ENABLE_CUSTOM_PUBSUB_SINK - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
- ENABLE_CUSTOM_PUBSUB_SOURCE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
- enableAbandonedNodeEnforcement(boolean) - Method in class org.apache.beam.sdk.testing.TestPipeline
-
Enables the abandoned node detection.
- enableAutoRunIfMissing(boolean) - Method in class org.apache.beam.sdk.testing.TestPipeline
-
If enabled, a pipeline.run() statement will be added automatically in case it is missing in the test.
- Enable client side metrics - Search tag in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
- Section
- enableSSL() - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
Enable SSL connection to Redis server.
- EnableStreamingEngineFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.EnableStreamingEngineFactory
- EnableWindmillServiceDirectPathFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.EnableWindmillServiceDirectPathFactory
- EncodableThrowable - Class in org.apache.beam.sdk.values
-
A wrapper around a Throwable for use with coders.
- encode(byte[], OutputStream) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
- encode(byte[], OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.NullableCoder
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.NullableCoder
- encode(HyperLogLogPlus, OutputStream) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
- encode(TableRow, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- encode(TableRow, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- encode(JsonArray, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
- encode(ByteString, OutputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- encode(ByteString, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- encode(IterableT, OutputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
- encode(Boolean, OutputStream) - Method in class org.apache.beam.sdk.coders.BooleanCoder
- encode(Byte, OutputStream) - Method in class org.apache.beam.sdk.coders.ByteCoder
- encode(Double, OutputStream) - Method in class org.apache.beam.sdk.coders.DoubleCoder
- encode(Float, OutputStream) - Method in class org.apache.beam.sdk.coders.FloatCoder
- encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
- encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
- encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.VarIntCoder
- encode(Integer, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
- encode(Long, OutputStream) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
- encode(Long, OutputStream) - Method in class org.apache.beam.sdk.coders.VarLongCoder
- encode(Short, OutputStream) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
- encode(String, OutputStream) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
- encode(String, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
- encode(Void, OutputStream) - Method in class org.apache.beam.sdk.coders.VoidCoder
- encode(BigDecimal, OutputStream) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
- encode(BigDecimal, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
- encode(BigInteger, OutputStream) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
- encode(BigInteger, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
- encode(BitSet, OutputStream) - Method in class org.apache.beam.sdk.coders.BitSetCoder
- encode(BitSet, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BitSetCoder
- encode(Map<K, V>, OutputStream) - Method in class org.apache.beam.sdk.coders.MapCoder
- encode(Map<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.MapCoder
- encode(Optional<T>, OutputStream) - Method in class org.apache.beam.sdk.coders.OptionalCoder
- encode(SortedMap<K, V>, OutputStream) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- encode(SortedMap<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- encode(K, OutputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.MetadataKeyCoder
- encode(KeyedWorkItem<K, ElemT>, OutputStream) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- encode(KeyedWorkItem<K, ElemT>, OutputStream, Coder.Context) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- encode(IsmFormat.Footer, OutputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.FooterCoder
- encode(IsmFormat.IsmRecord<V>, OutputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- encode(IsmFormat.IsmShard, OutputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShardCoder
- encode(IsmFormat.KeyPrefix, OutputStream) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefixCoder
- encode(RandomAccessData, OutputStream) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- encode(RandomAccessData, OutputStream, Coder.Context) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- encode(SequenceRangeAccumulator, OutputStream) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator.SequenceRangeAccumulatorCoder
- encode(EncodedBoundedWindow, OutputStream) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
- encode(CountingSource.CounterMark, OutputStream) - Method in class org.apache.beam.sdk.io.CountingSource.CounterMarkCoder
- encode(DefaultFilenamePolicy.Params, OutputStream) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
- encode(ElasticsearchIO.Document, OutputStream) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocumentCoder
- encode(FileBasedSink.FileResult<DestinationT>, OutputStream) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
- encode(FileIO.ReadableFile, OutputStream) - Method in class org.apache.beam.sdk.io.ReadableFileCoder
- encode(MatchResult.Metadata, OutputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
- encode(MatchResult.Metadata, OutputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoderV2
- encode(ResourceId, OutputStream) - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
- encode(BigQueryInsertError, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- encode(BigQueryStorageApiInsertError, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
- encode(RowMutation, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation.RowMutationCoder
- encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
- encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
- encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
- encode(BigtableWriteResult, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
- encode(FhirSearchParameter<T>, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
- encode(HealthcareIOError<T>, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
- encode(HL7v2Message, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
- encode(HL7v2ReadResponse, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponseCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
- encode(PubsubMessage, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- encode(PubsubMessage, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- encode(OffsetByteRange, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
- encode(SubscriptionPartition, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
- encode(Uuid, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
- encode(KafkaRecord<K, V>, OutputStream) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- encode(PulsarMessage, OutputStream) - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessageCoder
- encode(OffsetRange, OutputStream) - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
- encode(SplunkEvent, OutputStream) - Method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
- encode(TestStream<T>, OutputStream) - Method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
- encode(CoGbkResult, OutputStream) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
- encode(RawUnionValue, OutputStream) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- encode(RawUnionValue, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- encode(GlobalWindow, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
- encode(IntervalWindow, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
- encode(PaneInfo, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
- encode(FailsafeValueInSingleWindow<T, ErrorT>, OutputStream) - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
- encode(FailsafeValueInSingleWindow<T, ErrorT>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
- encode(KV<K, V>, OutputStream) - Method in class org.apache.beam.sdk.coders.KvCoder
- encode(KV<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.KvCoder
- encode(PCollectionViews.ValueOrMetadata<T, MetaT>, OutputStream) - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
- encode(ShardedKey<KeyT>, OutputStream) - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
- encode(TimestampedValue<T>, OutputStream) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
- encode(ValueInSingleWindow<T>, OutputStream) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
- encode(ValueInSingleWindow<T>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
- encode(ValueWithRecordId<ValueT>, OutputStream) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- encode(ValueWithRecordId<ValueT>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- encode(WindowedValue<T>, OutputStream) - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- encode(WindowedValue<T>, OutputStream) - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- encode(WindowedValue<T>, OutputStream) - Method in class org.apache.beam.sdk.values.WindowedValues.ValueOnlyWindowedValueCoder
-
Deprecated.
- encode(WindowedValue<T>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- encode(WindowedValue<T>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- encode(WindowedValue<T>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.WindowedValues.ValueOnlyWindowedValueCoder
-
Deprecated.
- encode(ByteString, OutputStream) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- encode(ByteString, OutputStream, Coder.Context) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- encode(ProducerRecord<K, V>, OutputStream) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- encode(TopicPartition, OutputStream) - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
- encode(Message, OutputStream) - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
- encode(Instant, OutputStream) - Method in class org.apache.beam.sdk.coders.InstantCoder
- encode(ReadableDuration, OutputStream) - Method in class org.apache.beam.sdk.coders.DurationCoder
- encode(AttributeValue, OutputStream) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.AttributeValueCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.Coder
-
Encodes the given value of type T onto the given output stream.
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.DelegateCoder
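As a quick illustration of Coder.encode above, a minimal sketch of round-tripping a value through a coder (encode and decode throw IOException, so the enclosing method must declare or handle it):
    // A minimal sketch: encode a String to bytes and decode it back.
    StringUtf8Coder coder = StringUtf8Coder.of();
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    coder.encode("hello", bytes);
    String roundTripped = coder.decode(new ByteArrayInputStream(bytes.toByteArray()));
    // roundTripped now equals "hello"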
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.SerializableCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.SnappyCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.ZstdCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.io.thrift.ThriftCoder
-
Encodes the given value of type T onto the given output stream using the provided ThriftCoder.protocolFactory.
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.schemas.SchemaCoder
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.Coder
-
Deprecated. Only implement and call Coder.encode(Object value, OutputStream).
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.DelegateCoder
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- encode(T, Coder<T>) - Static method in class org.apache.beam.runners.jet.Utils
- encodeAndHash(List<?>, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
-
Computes the shard id for the given key component(s).
- encodeAndHash(List<?>, RandomAccessData, List<Integer>) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
-
Computes the shard id for the given key component(s).
- encodeAndOwn(byte[], OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
Encodes the provided value with the identical encoding to ByteArrayCoder.encode(byte[], java.io.OutputStream), but with optimizations that take ownership of the value.
- EncodedBoundedWindow - Class in org.apache.beam.sdk.fn.windowing
-
An encoded BoundedWindow used within Runners to track window information without needing to decode the window.
- EncodedBoundedWindow() - Constructor for class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
- EncodedBoundedWindow.Coder - Class in org.apache.beam.sdk.fn.windowing
- encodeDoLoopBlackhole(VarIntBenchmark.Longs, VarIntBenchmark.BlackholeOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- encodeDoLoopByteString(VarIntBenchmark.Longs, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- encodeDoLoopTwiddleBlackhole(VarIntBenchmark.Longs, VarIntBenchmark.BlackholeOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- encodeDoLoopTwiddleByteString(VarIntBenchmark.Longs, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- EncodedValueComparator - Class in org.apache.beam.runners.flink.translation.types
-
Flink TypeComparator for Beam values that have been encoded to byte data by a Coder.
- EncodedValueComparator(boolean) - Constructor for class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- EncodedValueSerializer - Class in org.apache.beam.runners.flink.translation.types
-
TypeSerializer for values that were encoded using a Coder.
- EncodedValueSerializer() - Constructor for class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- EncodedValueTypeInformation - Class in org.apache.beam.runners.flink.translation.types
-
Flink TypeInformation for Beam values that have been encoded to byte data by a Coder.
- EncodedValueTypeInformation() - Constructor for class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- encodeKey(K, Coder<K>) - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.FlinkKeyUtils
-
Encodes a key to a byte array wrapped inside a ByteBuffer.
- encodeLoopBlackhole(VarIntBenchmark.Longs, VarIntBenchmark.BlackholeOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- encodeLoopByteString(VarIntBenchmark.Longs, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- encodePacked32TimeSeconds(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes time as a 4-byte integer with seconds precision.
- encodePacked32TimeSeconds(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes time as a 4-byte integer with seconds precision.
- encodePacked64DatetimeMicros(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes dateTime as an 8-byte integer with microseconds precision.
- encodePacked64DatetimeMicros(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes dateTime as an 8-byte integer with microseconds precision.
- encodePacked64DatetimeSeconds(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes dateTime as an 8-byte integer with seconds precision.
- encodePacked64DatetimeSeconds(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes dateTime as an 8-byte integer with seconds precision.
- encodePacked64TimeMicros(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes time as an 8-byte integer with microseconds precision.
- encodePacked64TimeMicros(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes time as an 8-byte integer with microseconds precision.
- encodePacked64TimeNanos(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes time as an 8-byte integer with nanoseconds precision.
- encodePacked64TimeNanos(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes time as an 8-byte integer with nanoseconds precision.
- encodeQueryResult(Table) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
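As a quick illustration of the CivilTimeEncoder entries above, a minimal sketch (assuming the java.time overload; the example value is arbitrary):
    // A minimal sketch: pack a civil time into a long with microsecond precision.
    long packedMicros =
        CivilTimeEncoder.encodePacked64TimeMicros(java.time.LocalTime.of(12, 30, 15));
    // The packed value holds hour/minute/second bit fields plus microseconds.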
- encodeQueryResult(Table, List<TableRow>) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- encoderFactory() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
- EncoderFactory - Class in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
- EncoderFactory() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderFactory
- encoderFor(Coder<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
- EncoderHelpers - Class in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
-
Encoders utility class.
- EncoderHelpers() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
- EncoderHelpers.Utils - Class in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
-
Encoder / expression utils that are called from generated code.
- encoderOf(Class<? super T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
-
Gets or creates a default Encoder for EncoderHelpers.
- encoderOf(Coder<T>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
- encoderOf(Coder<T>, EncoderProvider.Factory<T>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
- encoderOf(Coder<T>, EncoderProvider.Factory<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- EncoderProvider - Interface in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
- EncoderProvider.Factory<T> - Interface in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
- encodeToTimerDataTimerId(String, String) - Static method in class org.apache.beam.runners.fnexecution.control.TimerReceiverFactory
-
Encodes transform and timer family ids into a single string which retains the human readable format len(transformId):transformId:timerId.
- encodeUnrolledBlackhole(VarIntBenchmark.Longs, VarIntBenchmark.BlackholeOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- encodeUnrolledByteString(VarIntBenchmark.Longs, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- ENCODING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- EncodingException - Exception Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
-
Represents an error during encoding (serializing) a class.
- EncodingException - Exception Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
-
Represents an error during encoding (serializing) a class.
- EncodingException(Throwable) - Constructor for exception class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.EncodingException
- EncodingException(Throwable) - Constructor for exception class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.EncodingException
- end() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns the end of this window, exclusive.
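For example (the window bounds below are illustrative):
  import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
  import org.joda.time.Instant;

  IntervalWindow w = new IntervalWindow(new Instant(0), new Instant(60_000));
  w.end();          // 1970-01-01T00:01:00.000Z, exclusive
  w.maxTimestamp(); // end() minus 1 ms: the latest timestamp the window contains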
- END_CURSOR - Static variable in class org.apache.beam.sdk.io.redis.RedisCursor
- END_OF_WINDOW - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
The policy of using the end of the window, regardless of input timestamps.
- endpoint() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
-
Optional service endpoint to use AWS compatible services instead, e.g.
- endpoint(URI) - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
-
Optional service endpoint to use AWS compatible services instead, e.g.
- ENDS_WITH - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- ENDS_WITH_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
- endsWith(String) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- endsWith(String) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.endsWith(java.lang.String).
- endsWith(String, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- endsWith(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- endsWith(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- Enhanced Fan-Out - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- Enhanced Fan-Out and KinesisIO state management - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- Enhanced Fan-Out and other KinesisIO settings - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- ensureUsableAsCloudPubsub() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
-
Ensure that all messages that pass through can be converted to Cloud Pub/Sub messages using the standard transformation methods in the client library.
- ENTER_TRANSFORM - Enum constant in enum class org.apache.beam.sdk.Pipeline.PipelineVisitor.CompositeBehavior
- enterArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by FieldSpecifierNotationParser.arrayQualifier().
- enterArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by FieldSpecifierNotationParser.arrayQualifier().
- enterArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by the arrayQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
- enterArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by the arrayQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
- enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.flink.translation.utils.LookupPipelineVisitor
- enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkNativePipelineVisitor
- enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
- enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.translation.GroupByKeyVisitor
- enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
- enterCompositeTransform(TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
-
Called for each composite transform after all topological predecessors have been visited but before any of its component transforms.
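A minimal traversal sketch, assuming an existing Pipeline named pipeline:
  import org.apache.beam.sdk.Pipeline;
  import org.apache.beam.sdk.runners.TransformHierarchy;

  pipeline.traverseTopologically(
      new Pipeline.PipelineVisitor.Defaults() {
        @Override
        public CompositeBehavior enterCompositeTransform(TransformHierarchy.Node node) {
          System.out.println("Entering composite: " + node.getFullName());
          return CompositeBehavior.ENTER_TRANSFORM; // descend into component transforms
        }
      });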
- enterDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by FieldSpecifierNotationParser.dotExpression().
- enterDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by FieldSpecifierNotationParser.dotExpression().
- enterEveryRule(ParserRuleContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
- enterFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by FieldSpecifierNotationParser.fieldSpecifier().
- enterFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by FieldSpecifierNotationParser.fieldSpecifier().
- enterMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by FieldSpecifierNotationParser.mapQualifier().
- enterMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by FieldSpecifierNotationParser.mapQualifier().
- enterMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by the mapQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
- enterMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by the mapQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
- enterPipeline(Pipeline) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
- enterPipeline(Pipeline) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
-
Called before visiting any values or transforms, as many uses of a visitor require access to the Pipeline object itself.
- enterQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by FieldSpecifierNotationParser.qualifiedComponent().
- enterQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by FieldSpecifierNotationParser.qualifiedComponent().
- enterQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by the qualifyComponent labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
- enterQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by the qualifyComponent labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifyComponentContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
- enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.WildcardContext
- enterSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by the simpleIdentifier labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
- enterSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by the simpleIdentifier labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
- enterWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Enter a parse tree produced by the wildcard labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
- enterWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Enter a parse tree produced by the wildcard labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
- EntityToRow - Class in org.apache.beam.sdk.io.gcp.datastore
- entries() - Method in interface org.apache.beam.sdk.state.MapState
-
Returns an Iterable over the key-value pairs contained in this map.
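A minimal stateful-DoFn sketch reading entries(); the state id "counts" and class name are hypothetical:
  import java.util.Map;
  import org.apache.beam.sdk.coders.StringUtf8Coder;
  import org.apache.beam.sdk.coders.VarIntCoder;
  import org.apache.beam.sdk.state.MapState;
  import org.apache.beam.sdk.state.StateSpec;
  import org.apache.beam.sdk.state.StateSpecs;
  import org.apache.beam.sdk.transforms.DoFn;
  import org.apache.beam.sdk.values.KV;

  class TagCountsFn extends DoFn<KV<String, String>, String> {
    @StateId("counts")
    private final StateSpec<MapState<String, Integer>> countsSpec =
        StateSpecs.map(StringUtf8Coder.of(), VarIntCoder.of());

    @ProcessElement
    public void process(ProcessContext c, @StateId("counts") MapState<String, Integer> counts) {
      counts.put(c.element().getValue(), 1);
      // entries() returns a ReadableState; read() materializes the Iterable.
      for (Map.Entry<String, Integer> e : counts.entries().read()) {
        c.output(e.getKey() + "=" + e.getValue());
      }
    }
  }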
- entries() - Method in interface org.apache.beam.sdk.state.MultimapState
-
Returns an Iterable over all key-value pairs contained in this multimap.
- entrySet() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedMap
- entrySet() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- enum16(Map<String, Integer>) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- ENUM16 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- enum8(Map<String, Integer>) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- ENUM8 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- EnumerationType - Class in org.apache.beam.sdk.schemas.logicaltypes
-
This Schema.LogicalType represents an enumeration over a fixed set of values.
- EnumerationType.Value - Class in org.apache.beam.sdk.schemas.logicaltypes
-
This class represents a single enum value.
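A brief sketch of declaring and using the logical type; the create(...) and valueOf(...) factory methods are assumed:
  import org.apache.beam.sdk.schemas.Schema;
  import org.apache.beam.sdk.schemas.logicaltypes.EnumerationType;

  EnumerationType color = EnumerationType.create("RED", "GREEN", "BLUE");
  Schema schema = Schema.builder().addLogicalTypeField("color", color).build();

  EnumerationType.Value red = color.valueOf("RED"); // a single enum value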
- enumValues() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- ENVIRONMENT_VERSION_JOB_TYPE_KEY - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- ENVIRONMENT_VERSION_MAJOR_KEY - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- EnvironmentFactory - Interface in org.apache.beam.runners.fnexecution.environment
-
Creates environments which communicate with an SdkHarnessClient.
- EnvironmentFactory.Provider - Interface in org.apache.beam.runners.fnexecution.environment
-
Provider for an EnvironmentFactory and ServerFactory for the environment.
- equal(T) - Static method in class org.apache.beam.sdk.transforms.Filter
-
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements equal to a given value.
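Usage sketch (numbers is an assumed PCollection<Integer>):
  import org.apache.beam.sdk.transforms.Filter;
  import org.apache.beam.sdk.values.PCollection;

  // Keep only elements equal to 42.
  PCollection<Integer> onlyFortyTwos = numbers.apply(Filter.equal(42));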
- equals(Object) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- equals(Object) - Method in class org.apache.beam.runners.dataflow.util.CloudObject
- equals(Object) - Method in class org.apache.beam.runners.dataflow.util.OutputReference
- equals(Object) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
- equals(Object) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- equals(Object) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- equals(Object) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- equals(Object) - Method in class org.apache.beam.runners.jet.Utils.ByteArrayKey
- equals(Object) - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
- equals(Object) - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
- equals(Object) - Method in class org.apache.beam.runners.spark.util.ByteArray
- equals(Object) - Method in class org.apache.beam.runners.spark.util.TimerUtils.TimerMarker
- equals(Object) - Method in class org.apache.beam.sdk.coders.AtomicCoder
- equals(Object) - Method in class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated.
- equals(Object) - Method in class org.apache.beam.sdk.coders.DelegateCoder
- equals(Object) - Method in class org.apache.beam.sdk.coders.RowCoder
- equals(Object) - Method in class org.apache.beam.sdk.coders.SerializableCoder
- equals(Object) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- equals(Object) - Method in class org.apache.beam.sdk.coders.StructuralByteArray
- equals(Object) - Method in class org.apache.beam.sdk.coders.StructuredCoder
- equals(Object) - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- equals(Object) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
- equals(Object) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
- equals(Object) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- equals(Object) - Method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
- equals(Object) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- equals(Object) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- equals(Object) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
- equals(Object) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
- equals(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- equals(Object) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
- equals(Object) - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteSuccessSummary
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- equals(Object) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
- equals(Object) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
- equals(Object) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
- equals(Object) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
- equals(Object) - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- equals(Object) - Method in class org.apache.beam.sdk.io.range.ByteKey
- equals(Object) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
- equals(Object) - Method in class org.apache.beam.sdk.io.range.OffsetRange
- equals(Object) - Method in class org.apache.beam.sdk.io.redis.RedisCursor
- equals(Object) - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
You need to override this method to be able to compare these objects by value.
- equals(Object) - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
-
You need to override this method to be able to compare these objects by value.
- equals(Object) - Method in class org.apache.beam.sdk.io.solace.read.SolaceCheckpointMark
- equals(Object) - Method in class org.apache.beam.sdk.io.tika.ParseResult
- equals(Object) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- equals(Object) - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
- equals(Object) - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
- equals(Object) - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
- equals(Object) - Method in class org.apache.beam.sdk.schemas.CachingFactory
- equals(Object) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated.
- equals(Object) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
- equals(Object) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
- equals(Object) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.SetterTypeSupplier
- equals(Object) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType.Value
- equals(Object) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
- equals(Object) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns true if two Schemas have the same fields in the same order.
- equals(Object) - Method in class org.apache.beam.sdk.schemas.Schema.Field
- equals(Object) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- equals(Object) - Method in class org.apache.beam.sdk.schemas.Schema.Options
- equals(Object) - Method in class org.apache.beam.sdk.schemas.SchemaCoder
- equals(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- equals(Object) - Method in class org.apache.beam.sdk.testing.PAssert.PAssertionSite
- equals(Object) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
Deprecated.
Object.equals(Object) is not supported on PAssert objects. If you meant to test object equality, use a variant of PAssert.PCollectionContentsAssert.containsInAnyOrder(T...) instead.
- equals(Object) - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
- equals(Object) - Method in class org.apache.beam.sdk.testing.TestStream
- equals(Object) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
-
Deprecated.
- equals(Object) - Method in class org.apache.beam.sdk.transforms.CombineFns.CoCombineResult
- equals(Object) - Method in class org.apache.beam.sdk.transforms.display.DisplayData
- equals(Object) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
- equals(Object) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
- equals(Object) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
- equals(Object) - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
- equals(Object) - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
- equals(Object) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHint
- equals(Object) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
- equals(Object) - Method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
- equals(Object) - Method in class org.apache.beam.sdk.values.KV
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionList
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionTuple
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
- equals(Object) - Method in class org.apache.beam.sdk.values.Row
- equals(Object) - Method in class org.apache.beam.sdk.values.RowWithGetters
- equals(Object) - Method in class org.apache.beam.sdk.values.ShardedKey
- equals(Object) - Method in class org.apache.beam.sdk.values.TimestampedValue
- equals(Object) - Method in class org.apache.beam.sdk.values.TupleTag
- equals(Object) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Two type descriptors are equal if and only if they represent the same type.
- equals(Object) - Method in class org.apache.beam.sdk.values.TypeParameter
- equals(Object) - Method in class org.apache.beam.sdk.values.ValueWithRecordId
- equals(Object) - Method in class org.apache.beam.sdk.values.WindowingStrategy
- equals(Object) - Method in class org.apache.beam.runners.flink.adapter.FlinkKey
- equals(Object) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- equals(Object) - Method in class org.apache.beam.sdk.coders.ZstdCoder
- equals(Object) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
- equals(Object) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- equals(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- equals(Object) - Method in class org.apache.beam.sdk.io.cassandra.RingRange
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEndRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEventRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionStartRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
- equals(Object) - Method in class org.apache.beam.sdk.values.EncodableThrowable
- equals(WindowedValue<T>, WindowedValue<T>) - Static method in class org.apache.beam.sdk.values.WindowedValues
- equals(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- Equals() - Constructor for class org.apache.beam.sdk.values.Row.Equals
- EQUALS - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- equalTo(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.equalTo(Object).
- equalTo(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.equalTo(Object).
- equalToReference(byte[]) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- equivalent(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns true if two Schemas have the same fields, but possibly in different orders.
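For example:
  import org.apache.beam.sdk.schemas.Schema;

  Schema a = Schema.builder().addStringField("name").addInt32Field("age").build();
  Schema b = Schema.builder().addInt32Field("age").addStringField("name").build();

  boolean sameOrder = a.equals(b);      // false: same fields, different order
  boolean sameFields = a.equivalent(b); // true: order-insensitive comparison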
- equivalent(Schema.FieldType, Schema.EquivalenceNullablePolicy) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Check whether two types are equivalent.
- ERROR - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
-
Deprecated. Level for logging error messages.
- ERROR - Enum constant in enum class org.apache.beam.sdk.io.fs.MatchResult.Status
- ERROR - Enum constant in enum class org.apache.beam.sdk.io.snowflake.enums.StreamingLogLevel
- ERROR - Enum constant in enum class org.apache.beam.sdk.options.PipelineOptions.CheckEnabled
- ERROR - Enum constant in enum class org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
-
LogLevel for logging error messages.
- ERROR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- ERROR_MESSAGE - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata
-
TupleTag for any error response.
- ERROR_MSG_QUERY_FN - Static variable in class org.apache.beam.sdk.io.mongodb.MongoDbIO
- ERROR_ROW_SCHEMA - Static variable in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
- ERROR_ROW_WITH_ERR_MSG_SCHEMA - Static variable in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
- ERROR_SCHEMA - Static variable in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
- ERROR_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- ERROR_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
- ERROR_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider
- errorCodeFn - Variable in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
- ErrorContainer<T> - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
ErrorContainer interface.
- ErrorCounterFn(String, SerializableFunction<Row, byte[]>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
- ErrorCounterFn(String, SerializableFunction<Row, byte[]>, Schema, boolean, List<String>, Schema) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
- ErrorFn(String, Schema, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.ErrorFn
- ErrorFn(String, SerializableFunction<byte[], Row>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
- ErrorFn(String, SerializableFunction<byte[], Row>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider.ErrorFn
- ErrorFn(String, SerializableFunction<byte[], Row>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider.ErrorFn
- ErrorFn(String, SerializableFunction<byte[], Row>, Schema, List<String>, String, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
- ErrorFn(String, SerializableFunction<Row, byte[]>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider.ErrorFn
- ErrorHandler<ErrorT,
OutputT> - Interface in org.apache.beam.sdk.transforms.errorhandling
-
An Error Handler is a utility object used for plumbing error PCollections to a configured sink. Error Handlers must be closed before a pipeline is run to properly pipe error collections to the sink, and the pipeline will be rejected if any handlers aren't closed.
- ErrorHandler.BadRecordErrorHandler<OutputT> - Class in org.apache.beam.sdk.transforms.errorhandling
- ErrorHandler.DefaultErrorHandler<ErrorT,
OutputT> - Class in org.apache.beam.sdk.transforms.errorhandling
-
A default, placeholder error handler that exists to allow usage of .addErrorCollection() without effects.
- ErrorHandler.PTransformErrorHandler<ErrorT,
OutputT> - Class in org.apache.beam.sdk.transforms.errorhandling
- ErrorHandler.PTransformErrorHandler.WriteErrorMetrics<ErrorT> - Class in org.apache.beam.sdk.transforms.errorhandling
- ErrorHandler.PTransformErrorHandler.WriteErrorMetrics.CountErrors<ErrorT> - Class in org.apache.beam.sdk.transforms.errorhandling
- ErrorHandling - Class in org.apache.beam.sdk.schemas.transforms.providers
- ErrorHandling() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling
- ErrorHandling() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling
- ErrorHandling() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling
- ErrorHandling() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- ErrorHandling.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
- errorRecord(Schema, byte[], Throwable) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- errorRecord(Schema, Row, Throwable) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- errorSchema(Schema) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- errorSchemaBytes() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- estimate() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker.RangeEndEstimator
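RangeEndEstimator is a single-method interface, so a lambda works; the sketch below assumes a hypothetical fetchLatestOffset() helper that reports the growing end of the range, and the (start, estimator) constructor:
  import org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker;

  // Track [0, ∞) where the end grows over time, e.g. a streaming backlog.
  GrowableOffsetRangeTracker tracker =
      new GrowableOffsetRangeTracker(0L, () -> fetchLatestOffset());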
- estimateCount(T, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
-
Utility class to retrieve the estimated frequency of an element from a CountMinSketch.
- estimateFractionForKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns the fraction of this range [startKey, endKey) that is in the interval [startKey, key).
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
-
This method is called by org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats.
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRel
- estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
- estimateRowCount(PipelineOptions) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
-
Estimates the number of non-empty rows.
- estimateRowCount(RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- eval(BatchTSetEnvironment, SinkTSet<?>) - Method in class org.apache.beam.runners.twister2.BeamBatchWorker
- eval(SinkTSet<?>) - Method in class org.apache.beam.runners.twister2.Twister2BatchTranslationContext
- eval(SinkTSet<?>) - Method in class org.apache.beam.runners.twister2.Twister2StreamTranslationContext
- eval(SinkTSet<?>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- eval(Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.PatternCondition
- evaluate() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext
-
Trigger evaluation of all leaf datasets.
- evaluate(String, Dataset<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext
-
The purpose of this utility is to mark the evaluation of Spark actions, both during Pipeline translation, when evaluation is required, and when finally evaluating the pipeline.
- evaluate(ParDo.MultiOutput<KV<KeyT, ValueT>, OutputT>, EvaluationContext) - Method in class org.apache.beam.runners.spark.translation.streaming.StatefulStreamingParDoEvaluator
- evaluate(TransformT, EvaluationContext) - Method in interface org.apache.beam.runners.spark.translation.TransformEvaluator
- EvaluationContext - Class in org.apache.beam.runners.spark.structuredstreaming.translation
-
The EvaluationContext is the result of a pipeline translation and can be used to evaluate / run the pipeline.
- EvaluationContext - Class in org.apache.beam.runners.spark.translation
-
The EvaluationContext allows us to define pipeline instructions and translate between PObject<T>s or PCollection<T>s and Ts or DStreams/RDDs of Ts.
- EvaluationContext(JavaSparkContext, Pipeline, PipelineOptions) - Constructor for class org.apache.beam.runners.spark.translation.EvaluationContext
- EvaluationContext(JavaSparkContext, Pipeline, PipelineOptions, JavaStreamingContext) - Constructor for class org.apache.beam.runners.spark.translation.EvaluationContext
- Evaluator(SparkPipelineTranslator, EvaluationContext) - Constructor for class org.apache.beam.runners.spark.SparkRunner.Evaluator
- event() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
- EVENT_TIME - Enum constant in enum class org.apache.beam.sdk.state.TimeDomain
-
The TimeDomain.EVENT_TIME domain corresponds to the timestamps on the elements.
- EventExaminer<EventT,
StateT> - Interface in org.apache.beam.sdk.extensions.ordered
-
Classes implementing this interface will be called by OrderedEventProcessor to examine every incoming event.
- eventStore() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- eventStore() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- eventStore() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
- Event Timestamps and Watermark - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- eventually(Matcher<Iterable<? extends Row>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery.RowsAssertion
- ever() - Static method in class org.apache.beam.sdk.transforms.windowing.Never
-
Returns a trigger which never fires.
- every(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
Returns a new SlidingWindows with the original size that assigns timestamps into half-open intervals of the form [N * period, N * period + size), where 0 is the epoch.
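For example, 10-minute windows starting every minute (input is an assumed PCollection<String>):
  import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
  import org.apache.beam.sdk.transforms.windowing.Window;
  import org.apache.beam.sdk.values.PCollection;
  import org.joda.time.Duration;

  PCollection<String> windowed =
      input.apply(
          Window.<String>into(
              SlidingWindows.of(Duration.standardMinutes(10))
                  .every(Duration.standardMinutes(1))));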
- Example - Search tag in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO
- Section
- Example - Search tag in class org.apache.beam.sdk.io.mqtt.MqttIO
- Section
- Example - Search tag in interface org.apache.beam.sdk.io.range.RangeTracker
- Section
- Example - Search tag in org.apache.beam.sdk.io.csv.CsvIOParse.withCustomRecordParsing(String, SerializableFunction<String, OutputT>)
- Section
- Example: Matching a PCollection of filepatterns arriving from Kafka - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Example: Returning filenames and contents of compressed files matching a filepattern - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Example: Watching a single filepattern for new files - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Example: Writing CSV files - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Example: Writing CSV files to different directories and with different headers - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Example 1: Approximate Count of Ints PCollection<Integer> and specify precision - Search tag in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- Section
- Example 1: basic use - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Example 1: Create long-type sketch for a PCollection<Long> and specify precision - Search tag in class org.apache.beam.sdk.extensions.zetasketch.HllCount
- Section
- Example 1: default use - Search tag in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
- Section
- Example 1: Default use - Search tag in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
- Section
- Example 1: globally default use - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Example 2: Approximate Count of Key Value PCollection<KV<Integer,Foo>> - Search tag in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- Section
- Example 2: Create bytes-type sketch for a PCollection<KV<String, byte[]>> - Search tag in class org.apache.beam.sdk.extensions.zetasketch.HllCount
- Section
- Example 2: per key default use - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Example 2: tune accuracy parameters - Search tag in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
- Section
- Example 2: tune accuracy parameters - Search tag in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
- Section
- Example 2: use the CombineFn in a stateful ParDo - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Example 3: Approximate Count of Key Value PCollection<KV<Integer,Foo>> - Search tag in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- Section
- Example 3: Merge existing sketches in a PCollection<byte[]> into a new one - Search tag in class org.apache.beam.sdk.extensions.zetasketch.HllCount
- Section
- Example 3: query the resulting sketch - Search tag in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
- Section
- Example 3 : Query the resulting structure - Search tag in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
- Section
- Example 3: tune precision and use sparse representation - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Example 3: use the RetrieveCardinality utility class - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Example 4: Estimates the count of distinct elements in a PCollection<String> - Search tag in class org.apache.beam.sdk.extensions.zetasketch.HllCount
- Section
- Example 4: Using the CombineFn - Search tag in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
- Section
- Example 4: Using the CombineFn - Search tag in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
- Section
- Example PubsubIO read usage - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
- Section
- Example PubsubIO write usage - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
- Section
- Examples - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Examples - Search tag in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
- Section
- Examples - Search tag in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
- Section
- Examples - Search tag in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- Section
- Examples - Search tag in class org.apache.beam.sdk.extensions.zetasketch.HllCount
- Section
- Example usage - Search tag in org.apache.beam.sdk.io.csv.CsvIO.parse(Class<T>, CSVFormat)
- Section
- Example usage - Search tag in org.apache.beam.sdk.io.csv.CsvIO.parseRows(Schema, CSVFormat)
- Section
- Example usage: - Search tag in class org.apache.beam.sdk.io.csv.CsvIO
- Section
- Example usage: - Search tag in class org.apache.beam.sdk.io.json.JsonIO
- Section
- exceptAll() - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new PTransform that follows SET ALL semantics, taking a PCollectionList<PCollection<T>> and returning a PCollection<T> containing the difference all (exceptAll), computed in order across all collections in the PCollectionList<T>.
- exceptAll(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new PTransform that follows SET ALL semantics to compute the difference all (exceptAll) with the provided PCollection<T>.
- exceptDistinct() - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a PTransform that takes a PCollectionList<PCollection<T>> and returns a PCollection<T> containing the difference (except), computed in order across all collections in the PCollectionList<T>.
- exceptDistinct(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new PTransform that follows SET DISTINCT semantics to compute the difference (except) with the provided PCollection<T>.
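Usage sketch (left and right are assumed PCollection<String>s):
  import org.apache.beam.sdk.transforms.Sets;
  import org.apache.beam.sdk.values.PCollection;

  // Distinct elements of left that do not appear in right.
  PCollection<String> difference = left.apply(Sets.exceptDistinct(right));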
- exception() - Method in class org.apache.beam.sdk.transforms.WithFailures.ExceptionElement
- exception_thrown - Enum constant in enum class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent.Reason
- ExceptionAsMapHandler() - Constructor for class org.apache.beam.sdk.transforms.WithFailures.ExceptionAsMapHandler
- ExceptionElement() - Constructor for class org.apache.beam.sdk.transforms.WithFailures.ExceptionElement
- exceptionHandler - Variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- exceptionsInto(TypeDescriptor<FailureT>) - Method in class org.apache.beam.sdk.transforms.MapKeys
-
Returns a new SimpleMapWithFailures transform that catches exceptions raised while mapping elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using SimpleMapWithFailures.exceptionsVia(ProcessFunction).
- exceptionsInto(TypeDescriptor<FailureT>) - Method in class org.apache.beam.sdk.transforms.MapValues
-
Returns a new SimpleMapWithFailures transform that catches exceptions raised while mapping elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using SimpleMapWithFailures.exceptionsVia(ProcessFunction).
- exceptionsInto(TypeDescriptor<NewFailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
-
Returns a new AsJsons.AsJsonsWithFailures transform that catches exceptions raised while writing JSON elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using AsJsons.AsJsonsWithFailures.exceptionsVia(ProcessFunction).
- exceptionsInto(TypeDescriptor<NewFailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
-
Returns a new ParseJsons.ParseJsonsWithFailures transform that catches exceptions raised while parsing elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using ParseJsons.ParseJsonsWithFailures.exceptionsVia(ProcessFunction).
- exceptionsInto(TypeDescriptor<NewFailureT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
-
Returns a new FlatMapElements.FlatMapWithFailures transform that catches exceptions raised while mapping elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using FlatMapElements.FlatMapWithFailures.exceptionsVia(ProcessFunction).
- exceptionsInto(TypeDescriptor<NewFailureT>) - Method in class org.apache.beam.sdk.transforms.MapElements
-
Returns a new MapElements.MapWithFailures transform that catches exceptions raised while mapping elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using MapElements.MapWithFailures.exceptionsVia(ProcessFunction).
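A sketch of the exceptionsInto/exceptionsVia pattern on MapElements (words is an assumed PCollection<String>; the failing division is deliberate):
  import org.apache.beam.sdk.transforms.MapElements;
  import org.apache.beam.sdk.transforms.WithFailures.Result;
  import org.apache.beam.sdk.values.PCollection;
  import org.apache.beam.sdk.values.TypeDescriptors;

  Result<PCollection<Integer>, String> result =
      words.apply(
          MapElements.into(TypeDescriptors.integers())
              .via((String word) -> 1 / word.length()) // throws ArithmeticException on ""
              .exceptionsInto(TypeDescriptors.strings())
              .exceptionsVia(ee -> ee.exception().getMessage()));
  PCollection<Integer> output = result.output();
  PCollection<String> failures = result.failures();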
- exceptionsVia() - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
-
Returns a new AsJsons.AsJsonsWithFailures transform that catches exceptions raised while writing JSON elements, passing the raised exception instance and the input element being processed through the default exception handler AsJsons.DefaultExceptionAsMapHandler and emitting the result to a failure collection.
- exceptionsVia() - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
-
Returns a new ParseJsons.ParseJsonsWithFailures transform that catches exceptions raised while parsing elements, passing the raised exception instance and the input element being processed through the default exception handler ParseJsons.DefaultExceptionAsMapHandler and emitting the result to a failure collection.
- exceptionsVia(InferableFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
-
Returns a new AsJsons.AsJsonsWithFailures transform that catches exceptions raised while writing JSON elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
- exceptionsVia(InferableFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
-
Returns a new FlatMapElements.FlatMapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
- exceptionsVia(InferableFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.transforms.MapElements
-
Returns a new MapElements.MapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
- exceptionsVia(InferableFunction<WithFailures.ExceptionElement<String>, FailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
-
Returns a new ParseJsons.ParseJsonsWithFailures transform that catches exceptions raised while parsing elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
- exceptionsVia(InferableFunction<WithFailures.ExceptionElement<KV<K, V1>>, FailureT>) - Method in class org.apache.beam.sdk.transforms.MapValues
-
Returns a new SimpleMapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
- exceptionsVia(InferableFunction<WithFailures.ExceptionElement<KV<K1, V>>, FailureT>) - Method in class org.apache.beam.sdk.transforms.MapKeys
-
Returns a new SimpleMapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
- exceptionsVia(ProcessFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons.AsJsonsWithFailures
-
Returns a new AsJsons.AsJsonsWithFailures transform that catches exceptions raised while writing JSON elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
- exceptionsVia(ProcessFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements.FlatMapWithFailures
-
Returns a new FlatMapElements.FlatMapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
- exceptionsVia(ProcessFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.transforms.MapElements.MapWithFailures
-
Returns a PTransform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
- exceptionsVia(ProcessFunction<WithFailures.ExceptionElement<String>, FailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons.ParseJsonsWithFailures
-
Returns a new ParseJsons.ParseJsonsWithFailures transform that catches exceptions raised while parsing elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
- ExecutableGraph<ExecutableT,
CollectionT> - Interface in org.apache.beam.runners.direct
-
The interface that enables querying of a graph of independently executable stages and the inputs and outputs of those stages.
- ExecutableProcessBundleDescriptor() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
- ExecutableStageContext - Interface in org.apache.beam.runners.fnexecution.control
-
The context required in order to execute
stages
. - ExecutableStageContext.Factory - Interface in org.apache.beam.runners.fnexecution.control
-
Creates
ExecutableStageContext
instances. - ExecutableStageDoFnOperator<InputT,
OutputT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming -
This operator is the streaming equivalent of the
FlinkExecutableStageFunction
. - ExecutableStageDoFnOperator(String, Coder<WindowedValue<InputT>>, Map<TupleTag<?>, Coder<?>>, TupleTag<OutputT>, List<TupleTag<?>>, DoFnOperator.OutputManagerFactory<OutputT>, Map<Integer, PCollectionView<?>>, Collection<PCollectionView<?>>, Map<RunnerApi.ExecutableStagePayload.SideInputId, PCollectionView<?>>, PipelineOptions, RunnerApi.ExecutableStagePayload, JobInfo, FlinkExecutableStageContextFactory, Map<String, TupleTag<?>>, WindowingStrategy, Coder, KeySelector<WindowedValue<InputT>, ?>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
-
Constructor.
- execute() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
- execute(BatchTSetEnvironment) - Method in class org.apache.beam.runners.twister2.BeamBatchWorker
- execute(Runnable) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- execute(String) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
- execute(String) - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator.Executor
- execute(String) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
- execute(String) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
-
Executes the given SQL statement.
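A minimal sketch of driving BeamSqlCli, assuming an in-memory metastore backed by the text table provider; the table name, schema, and file path are illustrative:

    import org.apache.beam.sdk.extensions.sql.BeamSqlCli;
    import org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider;
    import org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore;

    // Register a table provider, then run DDL through the CLI.
    InMemoryMetaStore metaStore = new InMemoryMetaStore();
    metaStore.registerProvider(new TextTableProvider());
    BeamSqlCli cli = new BeamSqlCli().metaStore(metaStore);
    cli.execute(
        "CREATE EXTERNAL TABLE orders (id INT, amount DOUBLE) TYPE text LOCATION '/tmp/orders.csv'");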
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateCatalog
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateDatabase
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateExternalTable
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropCatalog
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropDatabase
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropTable
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlSetOptionBeam
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlUseCatalog
- execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlUseDatabase
- execute(Expression, Class<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- execute(Expression, Type) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- EXECUTE_BUNDLE - Enum constant in enum class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.WriteMethod
- executeBundles(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- executeBundles(ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- ExecuteBundles(String) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
- ExecuteBundles(ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
-
Instantiates a new ExecuteBundles transform.
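A minimal sketch, assuming bundles is a PCollection<FhirBundleParameter> (the element type the ExecuteBundles transform expands, per its expand entry later in this index) and an illustrative FHIR store path:

    import org.apache.beam.sdk.io.gcp.healthcare.FhirIO;

    // Execute FHIR bundles against a Cloud Healthcare FHIR store.
    bundles.apply(new FhirIO.ExecuteBundles(
        "projects/my-proj/locations/us-central1/datasets/my-ds/fhirStores/my-store"));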
- executeDdl(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
- executeFhirBundle(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Executes a FHIR bundle HTTP body.
- executeFhirBundle(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- executePipeline(BatchTSetEnvironment) - Method in class org.apache.beam.runners.twister2.BeamBatchWorker
- executeQuery(Queryable<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- ExecutionDriver - Interface in org.apache.beam.runners.local
-
Drives the execution of a
Pipeline
by scheduling work. - ExecutionDriver.DriverState - Enum Class in org.apache.beam.runners.local
-
The state of the driver.
- ExecutorOptions - Interface in org.apache.beam.sdk.options
-
Options for configuring the
ScheduledExecutorService
used throughout the Java runtime. - ExecutorOptions.ScheduledExecutorServiceFactory - Class in org.apache.beam.sdk.options
-
Returns the default
ScheduledExecutorService
to use within the Apache Beam SDK. - ExecutorServiceFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.ExecutorServiceFactory
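A minimal sketch of overriding the scheduled executor via ExecutorOptions; the pool size is illustrative, and the default factory above is used when nothing is set:

    import java.util.concurrent.Executors;
    import org.apache.beam.sdk.options.ExecutorOptions;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Swap in a custom ScheduledExecutorService for the SDK to use.
    PipelineOptions options = PipelineOptionsFactory.create();
    options.as(ExecutorOptions.class)
           .setScheduledExecutorService(Executors.newScheduledThreadPool(4));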
- exists() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
- exitArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.arrayQualifier()
. - exitArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.arrayQualifier()
. - exitArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by the
arrayQualifierList
labeled alternative in FieldSpecifierNotationParser.qualifierList()
. - exitArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by the
arrayQualifierList
labeled alternative in FieldSpecifierNotationParser.qualifierList()
. - exitDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.dotExpression()
. - exitDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.dotExpression()
. - exitEveryRule(ParserRuleContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
- exitFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.fieldSpecifier()
. - exitFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.fieldSpecifier()
. - exitMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.mapQualifier()
. - exitMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.mapQualifier()
. - exitMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by the
mapQualifierList
labeled alternative in FieldSpecifierNotationParser.qualifierList()
. - exitMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by the
mapQualifierList
labeled alternative in FieldSpecifierNotationParser.qualifierList()
. - exitQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.qualifiedComponent()
. - exitQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by
FieldSpecifierNotationParser.qualifiedComponent()
. - exitQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by the
qualifyComponent
labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent()
. - exitQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by the
qualifyComponent
labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent()
. - exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifyComponentContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
- exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.WildcardContext
- exitSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by the
simpleIdentifier
labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent()
. - exitSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by the
simpleIdentifier
labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent()
. - exitWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
Exit a parse tree produced by the
wildcard
labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent()
. - exitWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
-
Exit a parse tree produced by the
wildcard
labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent()
. - expand() - Method in class org.apache.beam.io.requestresponse.Result
- expand() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
- expand() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write.Result
- expand() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches.Result
- expand() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
- expand() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- expand() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteResult
- expand() - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
- expand() - Method in class org.apache.beam.sdk.io.solace.write.SolaceOutput
- expand() - Method in class org.apache.beam.sdk.io.WriteFilesResult
- expand() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Expands the component
PCollections
, stripping off any tag-specific information. - expand() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
- expand() - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
- expand() - Method in class org.apache.beam.sdk.values.PBegin
- expand() - Method in class org.apache.beam.sdk.values.PCollection
- expand() - Method in class org.apache.beam.sdk.values.PCollectionList
- expand() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
- expand() - Method in class org.apache.beam.sdk.values.PCollectionTuple
- expand() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
- expand() - Method in class org.apache.beam.sdk.values.PDone
- expand() - Method in interface org.apache.beam.sdk.values.PInput
- expand() - Method in interface org.apache.beam.sdk.values.POutput
- expand() - Method in interface org.apache.beam.sdk.values.PValue
-
Deprecated. A
PValue
always expands into itself. Calling PValue.expand()
on a PValue is almost never appropriate. - expand(InputT) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
- expand(InputT) - Method in class org.apache.beam.sdk.extensions.yaml.YamlTransform
- expand(InputT) - Method in class org.apache.beam.sdk.transforms.PTransform
-
Override this method to specify how this
PTransform
should be expanded on the given InputT
. - expand(ExpansionApi.ExpansionRequest, StreamObserver<ExpansionApi.ExpansionResponse>) - Method in class org.apache.beam.sdk.expansion.service.ExpansionService
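A minimal sketch of overriding PTransform.expand to build a composite transform; the CountWords name is illustrative:

    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.PTransform;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // A composite transform: expand() wires together existing transforms.
    class CountWords extends PTransform<PCollection<String>, PCollection<KV<String, Long>>> {
      @Override
      public PCollection<KV<String, Long>> expand(PCollection<String> words) {
        return words.apply(Count.perElement());
      }
    }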
- expand(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Expands a pattern into matched paths.
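A minimal sketch of GcsUtil.expand, assuming options carries GCP credentials; the bucket and glob are illustrative, and handling of the checked IOException is elided:

    import java.util.List;
    import org.apache.beam.sdk.extensions.gcp.options.GcsOptions;
    import org.apache.beam.sdk.extensions.gcp.util.GcsUtil;
    import org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath;

    // Expand a glob into the concrete GCS paths it matches.
    GcsUtil gcsUtil = options.as(GcsOptions.class).getGcsUtil();
    List<GcsPath> matches = gcsUtil.expand(GcsPath.fromUri("gs://my-bucket/logs/*.txt"));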
- expand(KeyedPCollectionTuple<K>) - Method in class org.apache.beam.sdk.transforms.join.CoGroupByKey
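A minimal sketch of the CoGroupByKey entry above, assuming emails and phones are both PCollection<KV<String, String>> keyed by the same id:

    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TupleTag;

    // Join two keyed collections; each result value bundles both inputs per key.
    TupleTag<String> emailsTag = new TupleTag<>();
    TupleTag<String> phonesTag = new TupleTag<>();
    PCollection<KV<String, CoGbkResult>> joined =
        KeyedPCollectionTuple.of(emailsTag, emails)
            .and(phonesTag, phones)
            .apply(CoGroupByKey.create());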
- expand(PBegin) - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
- expand(PBegin) - Method in class org.apache.beam.runners.spark.io.CreateStream
- expand(PBegin) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
- expand(PBegin) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorPTransform
- expand(PBegin) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.KinesisReadToBytes
- expand(PBegin) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
- expand(PBegin) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.FileIO.Match
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.ListHL7v2Messages
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscribeTransform
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
- expand(PBegin) - Method in class org.apache.beam.sdk.io.GenerateSequence
- expand(PBegin) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- expand(PBegin) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
- expand(PBegin) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
- expand(PBegin) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.TypedWithoutMetadata
- expand(PBegin) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
- expand(PBegin) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.Read.Bounded
- expand(PBegin) - Method in class org.apache.beam.sdk.io.Read.Unbounded
- expand(PBegin) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
- expand(PBegin) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.TextIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
- expand(PBegin) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.testing.PAssert.OneSideInputAssert
- expand(PBegin) - Method in class org.apache.beam.sdk.testing.TestStream
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.OfValueProvider
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.Values
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.WindowedValues
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Impulse
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
- expand(PCollection) - Method in class org.apache.beam.sdk.schemas.transforms.Join.Impl
- expand(PCollection<?>) - Method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.extensions.python.transforms.PythonMap
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.schemas.io.DeadLetteredTransform
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.MapElements
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
- expand(PCollection<? extends Iterable<T>>) - Method in class org.apache.beam.sdk.transforms.Flatten.Iterables
- expand(PCollection<? extends KV<?, V>>) - Method in class org.apache.beam.sdk.transforms.Values
- expand(PCollection<? extends KV<K, ?>>) - Method in class org.apache.beam.sdk.transforms.Keys
- expand(PCollection<? extends KV<K, ? extends Iterable<InputT>>>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
- expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ProtoFromBytes
- expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Write
- expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
- expand(PCollection<SearchGoogleAdsStreamRequest>) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.ReadAll
- expand(PCollection<GenericJson>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
-
The transform converts the contents of the input PCollection into
CatalogItem
s and then calls the Recommendation AI service to create the catalog item. - expand(PCollection<GenericJson>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- expand(PCollection<GenericJson>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
-
The transform converts the contents of the input PCollection into
UserEvent
s and then calls the Recommendation AI service to create the user event. - expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRow
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRowFlat
- expand(PCollection<Document>) - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText
- expand(PCollection<PubSubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.AddUuidsTransform
- expand(PCollection<PubSubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage
- expand(PCollection<SequencedMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidDeduplicationTransform
- expand(PCollection<Mutation>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- expand(PCollection<Entity>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
- expand(PCollection<Entity>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
- expand(PCollection<Entity>) - Method in class org.apache.beam.sdk.io.gcp.datastore.EntityToRow
- expand(PCollection<Key>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
- expand(PCollection<BatchGetDocumentsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments
- expand(PCollection<ListCollectionIdsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds
- expand(PCollection<ListDocumentsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments
- expand(PCollection<PartitionQueryRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery
- expand(PCollection<RunQueryRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery
- expand(PCollection<Write>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue
- expand(PCollection<Write>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary
- expand(PCollection<ByteString>) - Method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromBytes
- expand(PCollection<ElemT>) - Method in class org.apache.beam.runners.dataflow.CreateDataflowView
- expand(PCollection<ElemT>) - Method in class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView
- expand(PCollection<ElemT>) - Method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
- expand(PCollection<ErrorT>) - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler.WriteErrorMetrics
- expand(PCollection<EventT>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons.AsJsonsWithFailures
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.GloballyDistinct
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.GlobalSketch
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineGlobally
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements.FlatMapWithFailures
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.MapElements.MapWithFailures
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
- expand(PCollection<Double>) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.GlobalDigest
- expand(PCollection<String>) - Method in class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.CountWords
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
-
Deprecated.
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
-
Deprecated.
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons.ParseJsonsWithFailures
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromUri
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.CsvToRow
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.LinesReadConverter
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.csv.CsvIOParse
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.Read
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
-
Deprecated.
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.TextIO.Write
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.AllMatches
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Find
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindAll
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindKV
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindName
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindNameKV
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Matches
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesKV
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesName
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesNameKV
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.ReplaceAll
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.ReplaceFirst
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Split
- expand(PCollection<List<ElemT>>) - Method in class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView.CreateSparkPCollectionView
- expand(PCollection<CassandraIO.Read<T>>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.ReadAll
- expand(PCollection<ElasticsearchIO.Document>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.ReadFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ParseFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.TextIO.ReadFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.TFRecordIO.ReadFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.ReadFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
- expand(PCollection<MatchResult.Metadata>) - Method in class org.apache.beam.sdk.io.FileIO.ReadMatches
- expand(PCollection<FhirBundleParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
- expand(PCollection<FhirIOPatientEverything.PatientEverythingParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
- expand(PCollection<FhirSearchParameter<T>>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
- expand(PCollection<HL7v2Message>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
- expand(PCollection<HL7v2ReadParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read
- expand(PCollection<HL7v2ReadParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message
- expand(PCollection<PubsubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
- expand(PCollection<MutationGroup>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
- expand(PCollection<ReadOperation>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- expand(PCollection<HBaseIO.Read>) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.ReadAll
- expand(PCollection<KafkaSourceDescriptor>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- expand(PCollection<RabbitMqMessage>) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
- expand(PCollection<SolrIO.Read>) - Method in class org.apache.beam.sdk.io.solr.SolrIO.ReadAll
- expand(PCollection<SplunkEvent>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
- expand(PCollection<SuccessOrFailure>) - Method in class org.apache.beam.sdk.testing.PAssert.DefaultConcludeTransform
- expand(PCollection<PeriodicSequence.SequenceDefinition>) - Method in class org.apache.beam.sdk.transforms.PeriodicSequence
- expand(PCollection<KV<byte[], RowMutations>>) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
- expand(PCollection<KV<ByteString, VideoContext>>) - Method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromBytesWithContext
- expand(PCollection<KV<ByteString, Iterable<Mutation>>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
- expand(PCollection<KV<ByteString, Iterable<Mutation>>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
- expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
- expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages
- expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
- expand(PCollection<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteRecordsInconsistent
- expand(PCollection<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords
- expand(PCollection<KV<EventKeyT, KV<Long, EventT>>>) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessor
- expand(PCollection<KV<String, GenericJson>>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
-
The transform converts the contents of the input PCollection into
CatalogItem
s and then calls the Recommendation AI service to create the catalog item. - expand(PCollection<KV<String, GenericJson>>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
-
The transform converts the contents of the input PCollection into
UserEvent
s and then calls the Recommendation AI service to create the user event. - expand(PCollection<KV<String, VideoContext>>) - Method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromURIWithContext
- expand(PCollection<KV<String, String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
-
The transform converts the contents of the input PCollection into
Table.Row
s and then calls Cloud DLP service to perform the deidentification according to provided settings. - expand(PCollection<KV<String, String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
-
The transform converts the contents of the input PCollection into
Table.Row
s and then calls Cloud DLP service to perform the data inspection according to provided settings. - expand(PCollection<KV<String, String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
-
The transform converts the contents of the input PCollection into
Table.Row
s and then calls Cloud DLP service to perform the reidentification according to provided settings. - expand(PCollection<KV<String, String>>) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
- expand(PCollection<KV<String, Map<String, String>>>) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
- expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
- expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
- expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
- expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.WithShardedKey
- expand(PCollection<KV<K, Double>>) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.PerKeyDigest
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.PerKeyDistinct
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.PerKeySketch
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.PerKey
-
Deprecated.
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.Deduplicate.KeyedValues
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.GroupByKey
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.KvSwap
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeByKey
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.Reshuffle
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.View.AsMap
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.View.AsMultimap
- expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.extensions.joinlibrary.Join.FullOuterJoin
- expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.extensions.joinlibrary.Join.InnerJoin
- expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.extensions.joinlibrary.Join.LeftOuterJoin
- expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.extensions.joinlibrary.Join.RightOuterJoin
- expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.transforms.MapValues
- expand(PCollection<KV<K1, V>>) - Method in class org.apache.beam.sdk.transforms.MapKeys
- expand(PCollection<KV<KeyT, ValueT>>) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write
- expand(PCollection<KV<TableDestination, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
- expand(PCollection<KV<KafkaSourceDescriptor, KafkaRecord<K, V>>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaCommitOffset
- expand(PCollection<KV<ShardedKey<DestinationT>, Iterable<StorageApiWritePayload>>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritesShardedRecords
- expand(PCollection<KV<PrimaryKeyT, Iterable<KV<SecondaryKeyT, ValueT>>>>) - Method in class org.apache.beam.sdk.extensions.sorter.SortValues
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.python.transforms.DataframeTransform
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms.JoinAsLookup
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable.RowToDocument
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.LinesWriteConverter
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.WriteRows
- expand(PCollection<Mutation>) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
- expand(PCollection<HCatRecord>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
- expand(PCollection<ProducerRecord<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- expand(PCollection<Message>) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Write
- expand(PCollection<SolrInputDocument>) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
- expand(PCollection<Document>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable.DocumentToRow
- expand(PCollection<Document>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
- expand(PCollection<ParameterT>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
- expand(PCollection<ParameterT>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- expand(PCollection<ParameterT>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- expand(PCollection<RequestT>) - Method in class org.apache.beam.io.requestresponse.RequestResponseIO
- expand(PCollection<SendMessageRequest>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Write
-
Deprecated.
- expand(PCollection<T>) - Method in class org.apache.beam.runners.spark.io.ConsoleIO.Write.Unbound
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- expand(PCollection<ByteString>) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytes
-
Applies all necessary transforms to call the Vision API.
- expand(PCollection<KV<ByteString, ImageContext>>) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytesWithContext
-
Applies all necessary transforms to call the Vision API.
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUri
-
Applies all necessary transforms to call the Vision API.
- expand(PCollection<KV<String, ImageContext>>) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUriWithContext
-
Applies all necessary transforms to call the Vision API.
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.aws2.sns.SnsIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.CombineAsIterable
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.ReifyAsIterable
- expand(PCollection<Entity>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
- expand(PCollection<Key>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
- expand(PCollection<Entity>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ProtoToBytes
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.AddFields.Inner
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.Cast
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.DropFields.Inner
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.RenameFields.Inner
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.Select.Fields
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.Select.Flattened
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.WithKeys
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.testing.PAssert.GroupThenAssert
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.testing.PAssert.GroupThenAssertForSingleton
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.Globally
-
Deprecated.
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Deduplicate.Values
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Deduplicate.WithRepresentativeValues
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Distinct
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Distinct.WithRepresentativeValues
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Filter
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Partition
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeArbitrarily
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Reshuffle.ViaRandomKey
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Tee
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.ToJson
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsIterable
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsList
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Wait.OnSignal
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.windowing.Window
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.WithTimestamps
- expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
- expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
- expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
- expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.io.WriteFiles
- expand(PCollection<V>) - Method in class org.apache.beam.sdk.transforms.WithKeys
- expand(PCollection<ValueT>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
- expand(PCollectionList<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase
- expand(PCollectionList<T>) - Method in class org.apache.beam.sdk.transforms.Flatten.PCollections
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider.BigQueryExportSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider.BigQueryFileLoadsSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteSchemaTransformProvider.BigQueryWriteSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceSchemaTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.ExplodeTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.JavaFilterTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.JavaMapToFieldsTransform
- expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.LoggingTransform
- expand(PCollectionTuple) - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.ExpandCrossProduct
- expand(PCollectionTuple) - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.Impl
- expand(PInput) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
- expand(PInput) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
- expand(PInput) - Method in class org.apache.beam.sdk.managed.Managed.ManagedTransform
- expandInconsistent(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- expandInput(PInput) - Static method in class org.apache.beam.sdk.values.PValues
- expandOutput(POutput) - Static method in class org.apache.beam.sdk.values.PValues
- expandTriggered(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>, Coder<StorageApiWritePayload>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- expandUntriggered(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- expandValue(PValue) - Static method in class org.apache.beam.sdk.values.PValues
- ExpansionServer - Class in org.apache.beam.sdk.expansion.service
-
A
gRPC Server
for an ExpansionService. - ExpansionService - Class in org.apache.beam.sdk.expansion.service
-
A service that allows a pipeline to expand transforms from a remote SDK.
- ExpansionService() - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService
- ExpansionService(String[]) - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService
- ExpansionService(PipelineOptions) - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService
- ExpansionService(PipelineOptions, String) - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService
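A minimal sketch of hosting an expansion service for cross-language pipelines; the host and port are illustrative, the create(service, host, port) factory and accessors should be verified against your Beam version, and handling of the checked IOException is elided:

    import org.apache.beam.sdk.expansion.service.ExpansionServer;
    import org.apache.beam.sdk.expansion.service.ExpansionService;

    // Start a gRPC server that expands transforms on behalf of remote SDKs.
    ExpansionServer server = ExpansionServer.create(new ExpansionService(), "localhost", 8097);
    System.out.println("Listening on " + server.getHost() + ":" + server.getPort());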
- ExpansionService.ExpansionServiceRegistrar - Interface in org.apache.beam.sdk.expansion.service
-
A registrar that creates
TransformProvider
instances from RunnerApi.FunctionSpec
s. - ExpansionService.ExternalTransformRegistrarLoader - Class in org.apache.beam.sdk.expansion.service
-
Exposes Java transforms via
ExternalTransformRegistrar
. - ExpansionServiceConfig - Class in org.apache.beam.sdk.expansion.service
- ExpansionServiceConfig() - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
- ExpansionServiceConfigFactory() - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionServiceOptions.ExpansionServiceConfigFactory
- ExpansionServiceOptions - Interface in org.apache.beam.sdk.expansion.service
-
Options used to configure the ExpansionService.
- ExpansionServiceOptions.ExpansionServiceConfigFactory - Class in org.apache.beam.sdk.expansion.service
-
Loads the ExpansionService config.
- ExpansionServiceOptions.JavaClassLookupAllowListFactory - Class in org.apache.beam.sdk.expansion.service
-
Loads the allow list from ExpansionServiceOptions.getJavaClassLookupAllowlistFile(), defaulting to an empty JavaClassLookupTransformProvider.AllowList.
- ExpansionServiceSchemaTransformProvider - Class in org.apache.beam.sdk.expansion.service
- expectDryRunQuery(String, String, JobStatistics) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- EXPECTED_SQN_PATTERN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
-
Expected valid pattern for a StorageApiCDC.CHANGE_SQN_COLUMN value for use with BigQuery's _CHANGE_SEQUENCE_NUMBER format.
- expectFileToNotExist() - Method in class org.apache.beam.sdk.io.fs.CreateOptions
-
True if the file is expected to not exist.
- ExperimentalOptions - Interface in org.apache.beam.sdk.options
-
Apache Beam provides a number of experimental features that can be enabled with this flag.
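As a brief, hedged illustration of how these options are typically set; the experiment name "my_experiment" is a placeholder.

    import java.util.Arrays;
    import org.apache.beam.sdk.options.ExperimentalOptions;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class EnableExperiment {
      public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        // Enable an experimental feature; "my_experiment" is a placeholder name.
        options.as(ExperimentalOptions.class).setExperiments(Arrays.asList("my_experiment"));
      }
    }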
- explain(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
- explainLazily(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
-
A lazy explain via Object.toString() for logging purposes.
- explainQuery(String) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
-
Returns a human readable representation of the query execution plan.
- explainTerms(RelWriter) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
- explainTerms(RelWriter) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamPushDownIOSourceRel
- explainTerms(RelWriter) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
- explainTerms(RelWriter) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
- explainTerms(RelWriter) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
- explicitRandomPartitioner(int) - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
-
Explicit hash key partitioner that randomly returns one of x precalculated hash keys.
- EXPLODE_WINDOWS - Enum constant in enum class org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator.Type
- explodeWindows() - Method in interface org.apache.beam.sdk.values.WindowedValue
-
A representation of each of the actual values represented by this compressed WindowedValue, one per window.
- Export(ValueProvider<String>, ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export
- EXPORT - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
-
Export data to Google Cloud Storage in Avro format and read data files from that location.
- exportFhirResourceToBigQuery(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Export a FHIR Resource to BigQuery.
- exportFhirResourceToBigQuery(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- exportFhirResourceToGcs(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Export a FHIR Resource to GCS.
- exportFhirResourceToGcs(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- exportResources(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Export resources to GCS.
- exportResources(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
- exportResources(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
- ExportResourcesFn(ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
- ExpressionConverter - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
-
Extracts expressions (function calls, field accesses) from the resolved query nodes and converts them to RexNodes.
- ExpressionConverter(RelOptCluster, QueryPlanner.QueryParameters, UserFunctionDefinitions) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
- expressionsInFilter(List<RexNode>) - Static method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTableFilter
-
Counts the number of RexNodes involved in all supported filters.
- extend(String) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
-
Extend the path by appending a sub-component path.
- External() - Constructor for class org.apache.beam.sdk.io.GenerateSequence.External
- External() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External
- External() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External
- ExternalConfiguration() - Constructor for class org.apache.beam.sdk.io.GenerateSequence.External.ExternalConfiguration
- ExternalEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
-
An EnvironmentFactory which requests workers via the given URL in the Environment.
- ExternalEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
-
Provider of ExternalEnvironmentFactory.
- ExternalRead - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Exposes PubsubIO.Read as an external transform for cross-language usage.
- ExternalRead() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead
- ExternalRead.Configuration - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Parameters class to expose the transform to an external SDK.
- ExternalRead.ReadBuilder - Class in org.apache.beam.sdk.io.gcp.pubsub
- ExternalSchemaIOTransformRegistrar - Class in org.apache.beam.sdk.extensions.schemaio.expansion
- ExternalSchemaIOTransformRegistrar() - Constructor for class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar
- ExternalSchemaIOTransformRegistrar.Configuration - Class in org.apache.beam.sdk.extensions.schemaio.expansion
- ExternalSorter - Class in org.apache.beam.sdk.extensions.sorter
-
Does an external sort of the provided values.
- ExternalSorter.Options - Class in org.apache.beam.sdk.extensions.sorter
-
ExternalSorter.Options contains configuration of the sorter.
- ExternalSorter.Options.SorterType - Enum Class in org.apache.beam.sdk.extensions.sorter
-
Sorter type.
- ExternalSqlTransformRegistrar - Class in org.apache.beam.sdk.extensions.sql.expansion
- ExternalSqlTransformRegistrar() - Constructor for class org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar
- ExternalSqlTransformRegistrar.Configuration - Class in org.apache.beam.sdk.extensions.sql.expansion
- ExternalSynchronization - Interface in org.apache.beam.sdk.io.hadoop.format
-
Provides a mechanism for acquiring locks related to the job.
- ExternalTransformBuilder<ConfigT, InputT, OutputT> - Interface in org.apache.beam.sdk.transforms
-
An interface for building a transform from an externally provided configuration.
- ExternalTransformRegistrar - Interface in org.apache.beam.sdk.expansion
-
A registrar which contains a mapping from URNs to available ExternalTransformBuilders.
- ExternalTransformRegistrarImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- ExternalTransformRegistrarImpl() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
- ExternalTransformRegistrarLoader() - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService.ExternalTransformRegistrarLoader
- externalWithMetadata() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- ExternalWrite - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Exposes PubsubIO.Write as an external transform for cross-language usage.
- ExternalWrite() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite
- ExternalWrite.Configuration - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Parameters class to expose the transform to an external SDK.
- ExternalWrite.ParsePubsubMessageProtoAsPayloadFromWindowedValue - Class in org.apache.beam.sdk.io.gcp.pubsub
- ExternalWrite.WriteBuilder - Class in org.apache.beam.sdk.io.gcp.pubsub
- extractFromTypeParameters(T, Class<? super T>, TypeDescriptors.TypeVariableExtractor<T, V>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
Extracts a type from the actual type parameters of a parameterized class, subject to Java type erasure.
- extractFromTypeParameters(TypeDescriptor<T>, Class<? super T>, TypeDescriptors.TypeVariableExtractor<T, V>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
Like TypeDescriptors.extractFromTypeParameters(Object, Class, TypeVariableExtractor), but takes a TypeDescriptor of the instance being analyzed rather than the instance itself.
- extractKeys(Object, Object[], int) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- extractOutput() - Method in interface org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator
-
Extract output.
- extractOutput() - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
-
Returns the output value that is the result of combining all the input values represented by this accumulator.
- extractOutput(double[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- extractOutput(int[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- extractOutput(long[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- extractOutput(long[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- extractOutput(AccumT) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- extractOutput(AccumT) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- extractOutput(AccumT) - Method in interface org.apache.beam.sdk.extensions.sql.udf.AggregateFn
-
Returns the output value that is the result of combining all the input values represented by the given accumulator.
- extractOutput(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
- extractOutput(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns the output value that is the result of combining all the input values represented by the given accumulator.
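To make the createAccumulator/addInput/mergeAccumulators/extractOutput lifecycle concrete, here is a small averaging CombineFn in the style of the Beam programming guide; it is a sketch, not taken from this index.

    import org.apache.beam.sdk.transforms.Combine;

    public class AverageFn extends Combine.CombineFn<Integer, AverageFn.Accum, Double> {
      public static class Accum implements java.io.Serializable {
        int sum = 0;
        int count = 0;
      }

      @Override
      public Accum createAccumulator() {
        return new Accum();
      }

      @Override
      public Accum addInput(Accum accum, Integer input) {
        accum.sum += input;
        accum.count++;
        return accum;
      }

      @Override
      public Accum mergeAccumulators(Iterable<Accum> accums) {
        Accum merged = createAccumulator();
        for (Accum accum : accums) {
          merged.sum += accum.sum;
          merged.count += accum.count;
        }
        return merged;
      }

      @Override
      public Double extractOutput(Accum accum) {
        // Combine all inputs represented by the accumulator into a single output value.
        return accum.count == 0 ? 0.0 : ((double) accum.sum) / accum.count;
      }
    }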
- extractOutput(AccumT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns the output value that is the result of combining all the input values represented by the given accumulator.
- extractOutput(HyperLogLogPlus) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
Output the whole structure so it can be queried, reused or stored easily.
- extractOutput(MergingDigest) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
Output the whole structure so it can be queried, reused or stored easily.
- extractOutput(Iterable<T>) - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- extractOutput(Long) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
- extractOutput(Object[]) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- extractOutput(Object[], CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- extractOutput(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
- extractOutput(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
- extractOutput(List<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- extractOutput(List<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- extractOutput(List<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- extractOutput(K, AccumT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.CompleteFlinkCombiner
- extractOutput(K, AccumT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.FinalFlinkCombiner
- extractOutput(K, AccumT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in interface org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.FlinkCombiner
- extractOutput(K, AccumT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.PartialFlinkCombiner
- extractOutput(SequenceRangeAccumulator) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- extractOutput(SketchFrequencies.Sketch<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
Output the whole structure so it can be queried, reused or stored easily.
- extractOutput(CovarianceAccumulator) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- extractOutput(VarianceAccumulator) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- extractOutput(BeamBuiltinAggregations.BitXOr.Accum) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- extractOutput(ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- extractOutput(Combine.Holder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
- extractOutput(Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
- extractOutputs(PCollectionRowTuple) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceSchemaTransformProvider
- extractOutputs(OutputT) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
- extractOutputStream(SparkCombineFn.WindowedAccumulator<?, ?, AccumT, ?>) - Method in class org.apache.beam.runners.spark.translation.SparkCombineFn
-
Extracts the stream of accumulated values.
- extractTableNamesFromNode(SqlNode) - Static method in class org.apache.beam.sdk.extensions.sql.TableNameExtractionUtils
- extractTimestampAttribute(String, Map<String, String>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return the timestamp (in ms since the Unix epoch) to use for a Pubsub message with the given timestampAttribute and attributes.
- extractTimestampsFromValues() - Static method in class org.apache.beam.sdk.transforms.Reify
-
Extracts the timestamps from each value in a KV.
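A hedged usage fragment, assuming the transform turns KV<K, TimestampedValue<V>> elements back into KV<K, V> with each value's own timestamp; kvs is a placeholder input.

    PCollection<KV<String, TimestampedValue<Long>>> kvs = ...;
    PCollection<KV<String, Long>> reified = kvs.apply(Reify.extractTimestampsFromValues());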
F
- factory - Variable in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
- Factory<T> - Interface in org.apache.beam.sdk.schemas
-
A Factory interface for schema-related objects for a specific Java type.
- Factory() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
- FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
- FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.parser.BeamSqlParser
-
Parser factory.
- FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
- FACTORY - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Factory for creating Pubsub clients using gRPC transport.
- FACTORY - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
Factory for creating Pubsub clients using Json transport.
- FAIL_FAST - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
-
Invalid write to Spanner will cause the pipeline to fail.
- FAIL_IF_EXISTS - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ExistingPipelineOptions
- failed(Error) - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
-
Report that a failure has occurred.
- failed(Exception) - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
-
Report that a failure has occurred.
- FAILED - Enum constant in enum class org.apache.beam.runners.local.ExecutionDriver.DriverState
- FAILED - Enum constant in enum class org.apache.beam.sdk.PipelineResult.State
-
The job has failed.
- FAILED - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
-
The tag for the failed writes to the HL7v2 store.
- FAILED_BODY - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
The tag for the failed writes to the FHIR store.
- FAILED_BUNDLES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
-
The TupleTag used for bundles that failed to be executed for any reason.
- FAILED_FILES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
The tag for the files that failed to be written to the FHIR store.
- FAILED_PUBLISH_TAG - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO.Write
- FAILED_WRITES - Static variable in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- failedRecords(List<RecT>, List<ResT>) - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
- FailedRunningPipelineResults - Class in org.apache.beam.runners.jet
-
Alternative implementation of PipelineResult used to avoid throwing Exceptions in certain situations.
- FailedRunningPipelineResults(RuntimeException) - Constructor for class org.apache.beam.runners.jet.FailedRunningPipelineResults
- FailedWritesException(List<FirestoreV1.WriteFailure>) - Constructor for exception class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.FailedWritesException
- failOnInsert(Map<TableRow, List<TableDataInsertAllResponse.InsertErrors>>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
Cause a given TableRow object to fail when it's inserted.
- FailsafeValueInSingleWindow<T, ErrorT> - Class in org.apache.beam.sdk.values
-
An immutable tuple of value, timestamp, window, and pane.
- FailsafeValueInSingleWindow() - Constructor for class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
- FailsafeValueInSingleWindow.Coder<T, ErrorT> - Class in org.apache.beam.sdk.values
-
A coder for FailsafeValueInSingleWindow.
- failure(String, String, Metadata, Throwable) - Static method in class org.apache.beam.sdk.io.tika.ParseResult
- failure(PAssert.PAssertionSite, Throwable) - Static method in class org.apache.beam.sdk.testing.SuccessOrFailure
- Failure - Class in org.apache.beam.sdk.schemas.io
-
A generic failure of an IO transform.
- Failure() - Constructor for class org.apache.beam.sdk.schemas.io.Failure
- Failure() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
- FAILURE_COUNTER - Static variable in class org.apache.beam.sdk.testing.PAssert
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
- Failure.Builder - Class in org.apache.beam.sdk.schemas.io
- FailureCollectorWrapper - Class in org.apache.beam.sdk.io.cdap.context
-
A wrapper class for collecting ValidationFailure instances.
- FailureCollectorWrapper() - Constructor for class org.apache.beam.sdk.io.cdap.context.FailureCollectorWrapper
- failures() - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
- failuresTo(List<PCollection<FailureElementT>>) - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
-
Adds the failure collection to the passed list and returns just the output collection.
- FakeBigQueryServerStream(List<T>) - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
- FakeBigQueryServices - Class in org.apache.beam.sdk.io.gcp.testing
-
A fake implementation of BigQuery's query service.
- FakeBigQueryServices() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- FakeBigQueryServices.FakeBigQueryServerStream<T> - Class in org.apache.beam.sdk.io.gcp.testing
-
An implementation of BigQueryServices.BigQueryServerStream which takes a List as the Iterable to simulate a server stream.
- FakeDatasetService - Class in org.apache.beam.sdk.io.gcp.testing
-
A fake dataset service that can be serialized, for use in testReadFromTable.
- FakeDatasetService() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- FakeJobService - Class in org.apache.beam.sdk.io.gcp.testing
-
A fake implementation of BigQuery's job service.
- FakeJobService() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- FakeJobService(int) - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- Fanout() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
- Fault Tolerance - Search tag in class org.apache.beam.sdk.transforms.ParDo
- Section
- featureList - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytes
- featureList - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytesWithContext
- featureList - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUri
- featureList - Variable in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUriWithContext
- features() - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText
- fetchDataflowJobId() - Static method in class org.apache.beam.sdk.extensions.gcp.util.GceMetadataUtil
- fetchDataflowJobName() - Static method in class org.apache.beam.sdk.extensions.gcp.util.GceMetadataUtil
- fetchDataflowWorkerId() - Static method in class org.apache.beam.sdk.extensions.gcp.util.GceMetadataUtil
- FetchHL7v2Message() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message
-
Instantiates a new Fetch HL7v2 message DoFn.
- FetchHL7v2Message() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message
-
Instantiates a new Fetch HL7v2 message DoFn.
- fewKeys() - Method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns whether it groups just a few keys.
- fewKeys(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a Combine.PerKey, and sets fewKeys in GroupByKey.
- FhirBundleParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
-
FhirBundleParameter represents a FHIR bundle in JSON format to be executed on a FHIR store.
- FhirBundleParameter() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
- FhirBundleResponse - Class in org.apache.beam.sdk.io.gcp.healthcare
- FhirBundleResponse() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
- FhirIO - Class in org.apache.beam.sdk.io.gcp.healthcare
-
FhirIO provides an API for reading and writing resources to the Google Cloud Healthcare FHIR API.
- FhirIO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
- FhirIO.Deidentify - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Deidentify FHIR resources from a FHIR store to a destination FHIR store.
- FhirIO.Deidentify.DeidentifyFn - Class in org.apache.beam.sdk.io.gcp.healthcare
-
A function that schedules a deidentify operation and monitors the status.
- FhirIO.ExecuteBundles - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Execute bundles.
- FhirIO.ExecuteBundlesResult - Class in org.apache.beam.sdk.io.gcp.healthcare
-
ExecuteBundlesResult contains both successfully executed bundles and information to help debug failed executions (e.g. metadata & error msgs).
- FhirIO.Export - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Export FHIR resources from a FHIR store to new line delimited json files on GCS or BigQuery.
- FhirIO.Export.ExportResourcesFn - Class in org.apache.beam.sdk.io.gcp.healthcare
-
A function that schedules an export operation and monitors the status.
- FhirIO.Import - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Writes each bundle of elements to a new-line delimited JSON file on GCS and issues a fhirStores.import Request for that file.
- FhirIO.Import.ContentStructure - Enum Class in org.apache.beam.sdk.io.gcp.healthcare
-
The enum Content structure.
- FhirIO.Read - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Read.
- FhirIO.Read.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Result.
- FhirIO.Search<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Search.
- FhirIO.Search.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
- FhirIO.Write - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Write.
- FhirIO.Write.AbstractResult - Class in org.apache.beam.sdk.io.gcp.healthcare
- FhirIO.Write.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Result.
- FhirIO.Write.WriteMethod - Enum Class in org.apache.beam.sdk.io.gcp.healthcare
-
The enum Write method.
- FhirIOPatientEverything - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type FhirIOPatientEverything for querying a FHIR Patient resource's compartment.
- FhirIOPatientEverything() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
- FhirIOPatientEverything.PatientEverythingParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
-
PatientEverythingParameter defines required attributes for a FHIR GetPatientEverything request in FhirIOPatientEverything.
- FhirIOPatientEverything.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The Result for a FhirIOPatientEverything request.
- FhirResourcePagesIterator(HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod, HealthcareApiClient, String, String, String, Map<String, Object>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
- FhirSearchParameter<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
FhirSearchParameter represents the query parameters for a FHIR search request, used as a parameter for FhirIO.Search.
- FhirSearchParameterCoder<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
FhirSearchParameterCoder is the coder for FhirSearchParameter, which takes a coder for type T.
- fhirStoresImport(String, String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
Import method for batch writing resources.
- fhirStoresImport(String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- fhirStoresImport(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfNestedStringBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfStringBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.ByteBufferBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.BytesBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.DateTimeBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.IntBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfIntBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfNestedIntBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedBytesBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedIntBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBuilderBundle.Field
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBundle.Field
- field(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.transforms.AddFields.Inner
-
Add a new field of the specified type.
- field(String, Schema.FieldType, Object) - Method in class org.apache.beam.sdk.schemas.transforms.AddFields.Inner
-
Add a new field of the specified type.
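A small fragment showing both overloads; rows is a placeholder PCollection<Row>, and the field names are illustrative.

    PCollection<Row> augmented =
        rows.apply(
            AddFields.<Row>create()
                .field("userId", Schema.FieldType.STRING)          // added field, no default
                .field("retryCount", Schema.FieldType.INT32, 0));  // added field with a default value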
- field(Row, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedRow
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfNestedStringBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfStringBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ByteBufferBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.BytesBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.DateTimeBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.IntBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfIntBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfNestedIntBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedBytesBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedIntBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBuilderBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBundle.Field
- Field() - Constructor for class org.apache.beam.sdk.schemas.Schema.Field
- fieldAccess(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
-
Select a set of fields described in a FieldAccessDescriptor.
- fieldAccessDescriptor(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
-
Join by the following field access descriptor.
- FieldAccessDescriptor - Class in org.apache.beam.sdk.schemas
-
Used inside of a DoFn to describe which fields in a schema type need to be accessed for processing.
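For example, a descriptor selecting a top-level and a nested field; the field names are placeholders.

    FieldAccessDescriptor descriptor =
        FieldAccessDescriptor.withFieldNames("userId", "billing.city");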
- FieldAccessDescriptor.FieldDescriptor - Class in org.apache.beam.sdk.schemas
-
Description of a single field.
- FieldAccessDescriptor.FieldDescriptor.Builder - Class in org.apache.beam.sdk.schemas
-
Builder class.
- FieldAccessDescriptor.FieldDescriptor.ListQualifier - Enum Class in org.apache.beam.sdk.schemas
-
Qualifier for a list selector.
- FieldAccessDescriptor.FieldDescriptor.MapQualifier - Enum Class in org.apache.beam.sdk.schemas
-
Qualifier for a map selector.
- FieldAccessDescriptor.FieldDescriptor.Qualifier - Class in org.apache.beam.sdk.schemas
-
OneOf union for a collection selector.
- FieldAccessDescriptor.FieldDescriptor.Qualifier.Kind - Enum Class in org.apache.beam.sdk.schemas
-
The kind of qualifier.
- FieldAccessDescriptorParser - Class in org.apache.beam.sdk.schemas.parser
-
Parser for textual field-access selector.
- FieldAccessDescriptorParser() - Constructor for class org.apache.beam.sdk.schemas.parser.FieldAccessDescriptorParser
- FieldDescriptor() - Constructor for class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
- fieldFromType(TypeDescriptor, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference
-
Map a Java field type to a Beam Schema FieldType.
- fieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
-
Join by the following field ids.
- fieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
-
Select a set of top-level field ids from the row.
- fieldIdsAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return the field ids accessed.
- fieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
-
Join by the following field names.
- fieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
-
Select a set of top-level field names from the row.
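For example, projecting two fields; rows is a placeholder PCollection<Row>.

    PCollection<Row> projected = rows.apply(Select.fieldNames("userId", "location.city"));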
- fieldNamesAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return the field names accessed.
- fields() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
- fields() - Static method in class org.apache.beam.sdk.state.StateKeySpec
- fields(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.DropFields
- fields(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.DropFields
- fields(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.DropFields
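For example, dropping fields by name; rows is a placeholder PCollection<Row>.

    PCollection<Row> trimmed = rows.apply(DropFields.fields("internalId", "debugInfo"));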
- Fields() - Constructor for class org.apache.beam.sdk.schemas.transforms.Select.Fields
- Fields: - Search tag in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
- Section
- FieldsEqual() - Constructor for class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
- fieldSpecifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- FieldSpecifierContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
- FieldSpecifierNotationBaseListener - Class in org.apache.beam.sdk.schemas.parser.generated
-
This class provides an empty implementation of FieldSpecifierNotationListener, which can be extended to create a listener which only needs to handle a subset of the available methods.
- FieldSpecifierNotationBaseListener() - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
- FieldSpecifierNotationBaseVisitor<T> - Class in org.apache.beam.sdk.schemas.parser.generated
-
This class provides an empty implementation of FieldSpecifierNotationVisitor, which can be extended to create a visitor which only needs to handle a subset of the available methods.
- FieldSpecifierNotationBaseVisitor() - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
- FieldSpecifierNotationLexer - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationLexer(CharStream) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- FieldSpecifierNotationListener - Interface in org.apache.beam.sdk.schemas.parser.generated
-
This interface defines a complete listener for a parse tree produced by FieldSpecifierNotationParser.
- FieldSpecifierNotationParser - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser(TokenStream) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- FieldSpecifierNotationParser.ArrayQualifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.ArrayQualifierListContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.DotExpressionComponentContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.DotExpressionContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.FieldSpecifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.MapQualifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.MapQualifierListContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.QualifiedComponentContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.QualifierListContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.QualifyComponentContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.SimpleIdentifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationParser.WildcardContext - Class in org.apache.beam.sdk.schemas.parser.generated
- FieldSpecifierNotationVisitor<T> - Interface in org.apache.beam.sdk.schemas.parser.generated
-
This interface defines a complete generic visitor for a parse tree produced by FieldSpecifierNotationParser.
- FieldType() - Constructor for class org.apache.beam.sdk.schemas.Schema.FieldType
- FieldTypeDescriptors - Class in org.apache.beam.sdk.schemas
-
Utilities for converting between Schema field types and TypeDescriptors that define Java objects which can represent these field types.
- FieldTypeDescriptors() - Constructor for class org.apache.beam.sdk.schemas.FieldTypeDescriptors
- fieldTypeForJavaType(TypeDescriptor) - Static method in class org.apache.beam.sdk.schemas.FieldTypeDescriptors
-
Get a Schema.FieldType from a TypeDescriptor.
- fieldUpdate(String, String, String) - Static method in class org.apache.beam.sdk.io.mongodb.UpdateField
- FieldValueGetter<ObjectT, ValueT> - Interface in org.apache.beam.sdk.schemas
-
For internal use only; no backwards-compatibility guarantees.
- fieldValueGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated. New implementations should override GetterBasedSchemaProvider.fieldValueGetters(TypeDescriptor, Schema) and make this method throw an UnsupportedOperationException.
- fieldValueGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.extensions.avro.schemas.AvroRecordSchema
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated. Delegates to GetterBasedSchemaProvider.fieldValueGetters(Class, Schema) for backwards compatibility; override it if you want to use the richer type signature contained in the TypeDescriptor, which is not subject to type erasure.
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
- fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
- FieldValueSetter<ObjectT, ValueT> - Interface in org.apache.beam.sdk.schemas
-
For internal use only; no backwards-compatibility guarantees.
- FieldValueTypeInformation - Class in org.apache.beam.sdk.schemas
-
Represents type information for a Java type that will be used to infer a Schema type.
- FieldValueTypeInformation() - Constructor for class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- FieldValueTypeInformation.Builder - Class in org.apache.beam.sdk.schemas
- fieldValueTypeInformations(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated. New implementations should override GetterBasedSchemaProvider.fieldValueTypeInformations(TypeDescriptor, Schema) and make this method throw an UnsupportedOperationException.
- fieldValueTypeInformations(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.extensions.avro.schemas.AvroRecordSchema
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated. Delegates to GetterBasedSchemaProvider.fieldValueTypeInformations(Class, Schema) for backwards compatibility; override it if you want to use the richer type signature contained in the TypeDescriptor, which is not subject to type erasure.
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
- fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
- FieldValueTypeSupplier - Interface in org.apache.beam.sdk.schemas.utils
-
A naming policy for schema fields.
- FILE - Enum constant in enum class org.apache.beam.sdk.io.FileSystem.LineageLevel
- FILE_ARTIFACT_URN - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- FILE_LOADS - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
Use BigQuery load jobs to insert data.
- FILE_NAME_FIELD - Static variable in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
- FILE_TRIGGERING_BYTE_COUNT - Static variable in class org.apache.beam.sdk.io.WriteFiles
- FILE_TRIGGERING_RECORD_BUFFERING_DURATION - Static variable in class org.apache.beam.sdk.io.WriteFiles
- FILE_TRIGGERING_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.WriteFiles
- FileBasedReader(FileBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
Subclasses should not perform IO operations in the constructor.
- FileBasedSink<UserT, DestinationT, OutputT> - Class in org.apache.beam.sdk.io
-
Abstract class for file-based output.
- FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
-
Construct a FileBasedSink with the given temp directory, producing uncompressed files.
- FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>, Compression) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
-
Construct a FileBasedSink with the given temp directory and output channel type.
- FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>, FileBasedSink.WritableByteChannelFactory) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
-
Construct a FileBasedSink with the given temp directory and output channel type.
- FileBasedSink.CompressionType - Enum Class in org.apache.beam.sdk.io
-
Deprecated. Use Compression.
- FileBasedSink.DynamicDestinations<UserT, DestinationT, OutputT> - Class in org.apache.beam.sdk.io
-
A class that allows value-dependent writes in FileBasedSink.
- FileBasedSink.FilenamePolicy - Class in org.apache.beam.sdk.io
-
A naming policy for output files.
- FileBasedSink.FileResult<DestinationT> - Class in org.apache.beam.sdk.io
-
Result of a single bundle write.
- FileBasedSink.FileResultCoder<DestinationT> - Class in org.apache.beam.sdk.io
-
A coder for FileBasedSink.FileResult objects.
- FileBasedSink.OutputFileHints - Interface in org.apache.beam.sdk.io
-
Provides hints about how to generate output files, such as a suggested filename suffix (e.g.
- FileBasedSink.WritableByteChannelFactory - Interface in org.apache.beam.sdk.io
-
Implementations create instances of WritableByteChannel used by FileBasedSink and related classes to allow decorating, or otherwise transforming, the raw data that would normally be written directly to the WritableByteChannel passed into FileBasedSink.WritableByteChannelFactory.create(WritableByteChannel).
- FileBasedSink.WriteOperation<DestinationT, OutputT> - Class in org.apache.beam.sdk.io
-
Abstract operation that manages the process of writing to FileBasedSink.
- FileBasedSink.Writer<DestinationT, OutputT> - Class in org.apache.beam.sdk.io
-
Abstract writer that writes a bundle to a FileBasedSink.
- FileBasedSource<T> - Class in org.apache.beam.sdk.io
-
A common base class for all file-based Sources.
- FileBasedSource(MatchResult.Metadata, long, long, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
-
Create a FileBasedSource based on a single file.
- FileBasedSource(ValueProvider<String>, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
-
Like FileBasedSource(ValueProvider, EmptyMatchTreatment, long), but uses the default value of EmptyMatchTreatment.DISALLOW.
- FileBasedSource(ValueProvider<String>, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
-
Create a FileBasedSource based on a file or a file pattern specification, with the given strategy for treating filepatterns that do not match any files.
- FileBasedSource.FileBasedReader<T> - Class in org.apache.beam.sdk.io
-
A reader that implements code common to readers of FileBasedSources.
- FileBasedSource.Mode - Enum Class in org.apache.beam.sdk.io
-
A given FileBasedSource represents a file resource of one of these types.
- FileChecksumMatcher - Class in org.apache.beam.sdk.testing
-
Matcher to verify the checksum of the contents of a ShardedFile in an E2E test.
- fileContentsHaveChecksum(String) - Static method in class org.apache.beam.sdk.testing.FileChecksumMatcher
- FileIO - Class in org.apache.beam.sdk.io
-
General-purpose transforms for working with files: listing files (matching), reading and writing.
- FileIO() - Constructor for class org.apache.beam.sdk.io.FileIO
- FileIO.Match - Class in org.apache.beam.sdk.io
-
Implementation of FileIO.match().
- FileIO.MatchAll - Class in org.apache.beam.sdk.io
-
Implementation of FileIO.matchAll().
- FileIO.MatchConfiguration - Class in org.apache.beam.sdk.io
-
Describes configuration for matching filepatterns, such as EmptyMatchTreatment and continuous watching for matching files.
- FileIO.ReadableFile - Class in org.apache.beam.sdk.io
-
A utility class for accessing a potentially compressed file.
- FileIO.ReadMatches - Class in org.apache.beam.sdk.io
-
Implementation of FileIO.readMatches().
- FileIO.ReadMatches.DirectoryTreatment - Enum Class in org.apache.beam.sdk.io
-
Enum to control how directories are handled.
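A typical match-then-read fragment; pipeline and the bucket path are placeholders.

    PCollection<FileIO.ReadableFile> files =
        pipeline
            .apply(FileIO.match().filepattern("gs://my-bucket/logs/*.gz"))
            .apply(FileIO.readMatches());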
- FileIO.Sink<ElementT> - Interface in org.apache.beam.sdk.io
-
Specifies how to write elements to individual files in FileIO.write() and FileIO.writeDynamic().
- FileIO.Write<DestinationT, UserT> - Class in org.apache.beam.sdk.io
-
Implementation of FileIO.write() and FileIO.writeDynamic().
- FileIO.Write.FileNaming - Interface in org.apache.beam.sdk.io
-
A policy for generating names for shard files.
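A common write fragment using the sink from TextIO; lines is a placeholder PCollection<String>, and the output path is illustrative.

    lines.apply(
        FileIO.<String>write()
            .via(TextIO.sink())
            .to("gs://my-bucket/output/")
            .withSuffix(".txt"));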
- FilenamePolicy() - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
- File naming - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- filepattern(String) - Method in class org.apache.beam.sdk.io.FileIO.Match
-
Matches the given filepattern.
- filepattern(String) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
-
Matches the given filepattern.
- filepattern(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Match
-
Like FileIO.Match.filepattern(String) but using a ValueProvider.
- filepattern(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
-
Like TikaIO.Parse.filepattern(String) but using a ValueProvider.
- FILEPATTERN - Enum constant in enum class org.apache.beam.sdk.io.FileBasedSource.Mode
- Filepattern expansion and watching - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroIO
- Section
- Filepattern expansion and watching - Search tag in class org.apache.beam.sdk.io.TextIO
- Section
- Filepattern expansion and watching - Search tag in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO
- Section
- FileReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
- FileReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
- FileReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
- FileReadSchemaTransformFormatProvider - Interface in org.apache.beam.sdk.io.fileschematransform
-
Interface that provides a PTransform that reads in a PCollection of FileIO.ReadableFiles and outputs the data represented as a PCollection of Rows.
- FileReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.fileschematransform
- FileReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
- FileReporter - Class in org.apache.beam.runners.flink.metrics
-
Flink metrics reporter for writing metrics to a file specified via the "metrics.reporter.file.path" config key (assuming an alias of "file" for this reporter in the "metrics.reporters" setting).
- FileReporter() - Constructor for class org.apache.beam.runners.flink.metrics.FileReporter
- FileResult(ResourceId, int, BoundedWindow, PaneInfo, DestinationT) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FileResult
- FileResultCoder(Coder<BoundedWindow>, Coder<DestinationT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
- fileSize(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns the file size from GCS or throws FileNotFoundException if the resource does not exist.
- FileStagingOptions - Interface in org.apache.beam.sdk.options
-
File staging related options.
- FileSystem<ResourceIdT> - Class in org.apache.beam.sdk.io
-
File system interface in Beam.
- FileSystem() - Constructor for class org.apache.beam.sdk.io.FileSystem
- FileSystem.LineageLevel - Enum Class in org.apache.beam.sdk.io
- FileSystemRegistrar - Interface in org.apache.beam.sdk.io
-
A registrar that creates FileSystem instances from PipelineOptions.
- FileSystems - Class in org.apache.beam.sdk.io
-
Client-facing FileSystem utility.
- FileSystems() - Constructor for class org.apache.beam.sdk.io.FileSystems
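A hedged fragment; the match(String) and open(ResourceId) calls are assumptions to verify against your Beam version, the path is a placeholder, and a matching FileSystem must be registered via pipeline options first.

    MatchResult result = FileSystems.match("gs://my-bucket/data/*.csv");
    for (MatchResult.Metadata metadata : result.metadata()) {
      try (ReadableByteChannel channel = FileSystems.open(metadata.resourceId())) {
        // Read bytes from the channel here.
      }
    }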
- FileSystemUtils - Class in org.apache.beam.sdk.io
- FileSystemUtils() - Constructor for class org.apache.beam.sdk.io.FileSystemUtils
- FileWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
-
The configuration for building file writing transforms using SchemaTransform and SchemaTransformProvider.
- FileWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- FileWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
- FileWriteSchemaTransformConfiguration.CsvConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
-
Configures extra details related to writing CSV formatted files.
- FileWriteSchemaTransformConfiguration.CsvConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
- FileWriteSchemaTransformConfiguration.ParquetConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
-
Configures extra details related to writing Parquet formatted files.
- FileWriteSchemaTransformConfiguration.ParquetConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
- FileWriteSchemaTransformConfiguration.XmlConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
-
Configures extra details related to writing XML formatted files.
- FileWriteSchemaTransformConfiguration.XmlConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
- FileWriteSchemaTransformFormatProvider - Interface in org.apache.beam.sdk.io.fileschematransform
-
Provides a PTransform that writes a PCollection of Rows and outputs a PCollection of the file names according to a registered AutoService FileWriteSchemaTransformFormatProvider implementation.
- FileWriteSchemaTransformFormatProviders - Class in org.apache.beam.sdk.io.fileschematransform
-
FileWriteSchemaTransformFormatProviders contains FileWriteSchemaTransformFormatProvider implementations.
- FileWriteSchemaTransformFormatProviders() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformFormatProviders
- FileWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.fileschematransform
-
A TypedSchemaTransformProvider implementation for writing a Row PCollection to file systems, driven by a FileWriteSchemaTransformConfiguration.
- FileWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
- FillGaps<ValueT> - Class in org.apache.beam.sdk.extensions.timeseries
-
Fill gaps in timeseries.
- FillGaps() - Constructor for class org.apache.beam.sdk.extensions.timeseries.FillGaps
- FillGaps.FillGapsDoFn<ValueT> - Class in org.apache.beam.sdk.extensions.timeseries
- FillGaps.InterpolateData<ValueT> - Class in org.apache.beam.sdk.extensions.timeseries
-
Argument to the withInterpolateFunction function.
- Filter - Class in org.apache.beam.sdk.schemas.transforms
-
A PTransform for filtering a collection of schema types.
- Filter<T> - Class in org.apache.beam.sdk.transforms
-
PTransforms for filtering from a PCollection the elements satisfying a predicate, or satisfying an inequality with a given value based on the elements' natural ordering.
- Filter() - Constructor for class org.apache.beam.sdk.schemas.transforms.Filter
- FILTER - Enum constant in enum class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.PushDownOptions
- Filter.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
-
Implementation of the filter.
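To contrast the two Filter classes above, a fragment using the element-level Filter from org.apache.beam.sdk.transforms; pipeline is a placeholder.

    PCollection<Integer> numbers = pipeline.apply(Create.of(-2, -1, 0, 1, 2));
    // Keep elements greater than a value, using the natural ordering.
    PCollection<Integer> positive = numbers.apply(Filter.greaterThan(0));
    // Keep elements satisfying an arbitrary predicate.
    PCollection<Integer> even = numbers.apply(Filter.by((Integer n) -> n % 2 == 0));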
- filterCharacters(String) - Method in class org.apache.beam.runners.flink.metrics.FileReporter
- FilterForMutationDoFn - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
- FilterForMutationDoFn() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.FilterForMutationDoFn
- FilterUtils - Class in org.apache.beam.sdk.io.iceberg
-
Utilities that convert between a SQL filter expression and an Iceberg Expression.
- FilterUtils() - Constructor for class org.apache.beam.sdk.io.iceberg.FilterUtils
- FinalFlinkCombiner(CombineFnBase.GlobalCombineFn<?, AccumT, OutputT>) - Constructor for class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.FinalFlinkCombiner
- FINALIZE_STREAM - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.RpcMethod
- finalizeAllOutstandingBundles() - Method in class org.apache.beam.runners.fnexecution.control.BundleFinalizationHandlers.InMemoryFinalizer
-
All finalization requests will be sent without waiting for the responses.
- finalizeCheckpoint() - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.CheckpointMarkImpl
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.solace.read.SolaceCheckpointMark
- finalizeCheckpoint() - Method in interface org.apache.beam.sdk.io.UnboundedSource.CheckpointMark
-
Called by the system to signal that this checkpoint mark has been committed along with all the records which have been read from the UnboundedSource.UnboundedReader since the previous checkpoint was taken. - finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.UnboundedSource.CheckpointMark.NoopCheckpointMark
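As a hedged illustration of the finalizeCheckpoint() contract above, a minimal sketch of a custom checkpoint mark (the class name and logging body are invented for the example; real sources typically acknowledge upstream messages here, and a mark also needs an associated Coder to be usable by a source):

    import java.io.IOException;
    import java.util.List;
    import org.apache.beam.sdk.io.UnboundedSource;

    public class LoggingCheckpointMark implements UnboundedSource.CheckpointMark {
      private final List<String> recordIds;

      public LoggingCheckpointMark(List<String> recordIds) {
        this.recordIds = recordIds;
      }

      @Override
      public void finalizeCheckpoint() throws IOException {
        // Invoked once the checkpoint is durably committed, so it is now safe
        // to acknowledge everything read since the previous checkpoint.
        System.out.println("Committed " + recordIds.size() + " records");
      }
    }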
- finalizeDestination(DestinationT, BoundedWindow, Integer, Collection<FileBasedSink.FileResult<DestinationT>>) - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
- finalizeWriteStream(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
-
Finalize a write stream.
- finalizeWriteStream(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- finalizeWriteStream(String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- find(String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex. - find(String, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex. - find(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindName PTransform that checks if a portion of the line matches the Regex. - find(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex. - find(Pattern, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex. - find(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindName PTransform that checks if a portion of the line matches the Regex. - Find(Pattern, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.Find
- findAll(String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindAll PTransform that checks if a portion of the line matches the Regex. - findAll(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindAll PTransform that checks if a portion of the line matches the Regex. - FindAll(Pattern) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindAll
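A minimal usage sketch for the Regex.find factories listed above (input strings and the pattern are illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Regex;
    import org.apache.beam.sdk.values.PCollection;

    public class RegexFindExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        PCollection<String> lines = p.apply(Create.of("user42 logged in", "no id here"));
        // Regex.find emits, for each line, the portion matching the pattern.
        PCollection<String> ids = lines.apply(Regex.find("user[0-9]+"));
        p.run().waitUntilFinish();
      }
    }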
- findAllTableIndexes() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Finds all indexes for the metadata table.
- findAvailablePort() - Static method in class org.apache.beam.sdk.extensions.python.PythonService
- findDateTimePattern(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- findDateTimePattern(String, ImmutableMap<DateTimeUtils.TimestampPatterns, DateTimeFormatter>) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- findKV(String, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindKV PTransform that checks if a portion of the line matches the Regex. - findKV(String, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindNameKV PTransform that checks if a portion of the line matches the Regex. - findKV(Pattern, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindKV PTransform that checks if a portion of the line matches the Regex. - findKV(Pattern, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.FindNameKV PTransform that checks if a portion of the line matches the Regex. - FindKV(Pattern, int, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindKV
- FindName(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindName
- FindNameKV(Pattern, String, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindNameKV
- FindQuery - Class in org.apache.beam.sdk.io.mongodb
-
Builds a MongoDB FindQuery object.
- FindQuery() - Constructor for class org.apache.beam.sdk.io.mongodb.FindQuery
- finish() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- finish() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
- finish() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider.ErrorFn
- finish(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider.ErrorFn
- finish(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider.ErrorFn
- finish(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
- finish(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.ErrorFn
- finish(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.DataChangeRecordToRow
- finishBundle() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
- finishBundle() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
- finishBundle() - Method in class org.apache.beam.runners.spark.translation.DoFnRunnerWithMetrics
- finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
- finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
- finishBundle() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner. - finishBundle(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
- finishBundle(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedBatchedSolaceWriter
- finishBundle(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedStreamingSolaceWriter
- finishBundle(DoFn.FinishBundleContext, PipelineOptions) - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlSeekableTable
- finishBundle(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.transforms.View.ToListViewDoFn
- FinishBundleContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
- FINISHED - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
- finishRunnerBundle(DoFnRunner<InputT, OutputT>) - Method in class org.apache.beam.runners.jet.processors.ParDoP
- finishRunnerBundle(DoFnRunner<KV<?, ?>, OutputT>) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- finishSpecifying() - Method in interface org.apache.beam.sdk.state.StateSpec
-
For internal use only; no backwards-compatibility guarantees.
- finishSpecifying(PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
-
After building, finalizes this PValue to make it ready for running. - finishSpecifying(PInput, PTransform<?, ?>) - Method in interface org.apache.beam.sdk.values.PValue
-
After building, finalizes this PValue to make it ready for being used as an input to a PTransform. - finishSpecifying(PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PValueBase
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.io.requestresponse.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.solace.write.SolaceOutput
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.WriteFilesResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionList
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PDone
-
Does nothing; there is nothing to finish specifying.
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in interface org.apache.beam.sdk.values.POutput
-
As part of applying the producing PTransform, finalizes this output to make it ready for being used as an input and for running. - finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PValueBase
- finishSplit(int) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- finishWrite() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Called after all calls to FileBasedSink.Writer.writeHeader(), FileBasedSink.Writer.write(OutputT) and FileBasedSink.Writer.writeFooter(). - FIRE_ALWAYS - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.Window.ClosingBehavior
-
Always fire the last pane.
- FIRE_ALWAYS - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.Window.OnTimeBehavior
-
Always fire the on-time pane.
- FIRE_IF_NON_EMPTY - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.Window.ClosingBehavior
-
Only fire the last pane if there is new data since the previous firing.
- FIRE_IF_NON_EMPTY - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.Window.OnTimeBehavior
-
Only fire the on-time pane if there is new data since the previous firing.
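These enum constants are supplied when configuring a Window transform; a minimal sketch, with window size and allowed lateness chosen only for illustration:

    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.joda.time.Duration;

    public class ClosingBehaviorExample {
      public static void main(String[] args) {
        // Request that the final pane of each window fires even when empty.
        Window<String> windowing =
            Window.<String>into(FixedWindows.of(Duration.standardMinutes(5)))
                .withAllowedLateness(
                    Duration.standardMinutes(1), Window.ClosingBehavior.FIRE_ALWAYS);
        System.out.println(windowing);
      }
    }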
- fireEligibleTimers(InMemoryTimerInternals, Map<KV<String, String>, FnDataReceiver<Timer>>, Object) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
-
Fires all timers which are ready to be fired.
- FirestoreIO - Class in org.apache.beam.sdk.io.gcp.firestore
-
FirestoreIO provides an API for reading from and writing to Google Cloud Firestore. - FirestoreOptions - Interface in org.apache.beam.sdk.io.gcp.firestore
- FirestoreV1 - Class in org.apache.beam.sdk.io.gcp.firestore
-
FirestoreV1 provides lifecycle managed PTransforms for the Cloud Firestore v1 API. - FirestoreV1.BatchGetDocuments - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a PTransform<PCollection<BatchGetDocumentsRequest>, PCollection<BatchGetDocumentsResponse>> which will read from Firestore. - FirestoreV1.BatchGetDocuments.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for FirestoreV1.BatchGetDocuments allowing configuration and instantiation. - FirestoreV1.BatchWriteWithDeadLetterQueue - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a PTransform<PCollection<Write>, PCollection<FirestoreV1.WriteFailure>> which will write to Firestore. - FirestoreV1.BatchWriteWithDeadLetterQueue.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for FirestoreV1.BatchWriteWithDeadLetterQueue allowing configuration and instantiation. - FirestoreV1.BatchWriteWithSummary - Class in org.apache.beam.sdk.io.gcp.firestore
- FirestoreV1.BatchWriteWithSummary.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for FirestoreV1.BatchWriteWithSummary allowing configuration and instantiation. - FirestoreV1.FailedWritesException - Exception Class in org.apache.beam.sdk.io.gcp.firestore
-
Exception that is thrown if one or more Writes are unsuccessful with a non-retryable status code. - FirestoreV1.ListCollectionIds - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a PTransform<PCollection<ListCollectionIdsRequest>, PCollection<ListCollectionIdsResponse>> which will read from Firestore. - FirestoreV1.ListCollectionIds.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for FirestoreV1.ListCollectionIds allowing configuration and instantiation. - FirestoreV1.ListDocuments - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a PTransform<PCollection<ListDocumentsRequest>, PCollection<ListDocumentsResponse>> which will read from Firestore. - FirestoreV1.ListDocuments.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for FirestoreV1.ListDocuments allowing configuration and instantiation. - FirestoreV1.PartitionQuery - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a PTransform<PCollection<PartitionQueryRequest>, PCollection<RunQueryRequest>> which will read from Firestore. - FirestoreV1.PartitionQuery.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for FirestoreV1.PartitionQuery allowing configuration and instantiation. - FirestoreV1.Read - Class in org.apache.beam.sdk.io.gcp.firestore
-
Type safe builder factory for read operations.
- FirestoreV1.RunQuery - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a PTransform<PCollection<RunQueryRequest>, PCollection<RunQueryResponse>> which will read from Firestore. - FirestoreV1.RunQuery.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for FirestoreV1.RunQuery allowing configuration and instantiation. - FirestoreV1.Write - Class in org.apache.beam.sdk.io.gcp.firestore
-
Type safe builder factory for write operations.
- FirestoreV1.WriteFailure - Class in org.apache.beam.sdk.io.gcp.firestore
-
Failure details for an attempted Write. - FirestoreV1.WriteSuccessSummary - Class in org.apache.beam.sdk.io.gcp.firestore
-
Summary object produced when a number of writes are successfully written to Firestore in a single BatchWrite.
- fireTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- fireTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SplittableDoFnOperator
- fireTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.WindowDoFnOperator
- fireTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.translation.AbstractInOutIterator
-
Fires a timer using the DoFnRunner from the context and performs cleanup afterwards.
- fireTimerInternal(FlinkKey, TimerInternals.TimerData) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- fireTimerInternal(FlinkKey, TimerInternals.TimerData) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- fireTimestamp() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
-
Returns the firing timestamp of the current timer.
- first - Variable in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
- FIRST - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- firstInput(K, AccumT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.FinalFlinkCombiner
- firstInput(K, InputT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.CompleteFlinkCombiner
- firstInput(K, InputT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in interface org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.FlinkCombiner
- firstInput(K, InputT, PipelineOptions, SideInputReader, Collection<? extends BoundedWindow>) - Method in class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.PartialFlinkCombiner
- fixDefaults() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
Fixes all the defaults so that equals can be used to check that two strategies are the same, regardless of the state of "defaulted-ness".
- FIXED_LENGTH - Static variable in class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- FIXED_WINDOW_TVF - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.TVFStreamingUtils
- FixedBytes - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A LogicalType representing a fixed-length byte array.
- FixedPrecisionNumeric - Class in org.apache.beam.sdk.schemas.logicaltypes
-
Fixed precision numeric types used to represent jdbc NUMERIC and DECIMAL types.
- fixedSizeGlobally(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a PTransform that takes a PCollection<T>, selects sampleSize elements, uniformly at random, and returns a PCollection<Iterable<T>> containing the selected elements. - fixedSizePerKey(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a PCollection<KV<K, Iterable<V>>> that contains an output element mapping each distinct key in the input PCollection to a sample of sampleSize values associated with that key in the input PCollection, taken uniformly at random. - fixedString(int) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
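A minimal sketch of Sample.fixedSizeGlobally (element values and sample size are illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Sample;
    import org.apache.beam.sdk.values.PCollection;

    public class SampleExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        PCollection<Integer> values = p.apply(Create.of(1, 2, 3, 4, 5, 6, 7, 8));
        // Select 3 elements uniformly at random; the result is one Iterable.
        PCollection<Iterable<Integer>> sample = values.apply(Sample.fixedSizeGlobally(3));
        p.run().waitUntilFinish();
      }
    }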
- FixedString - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A LogicalType representing a fixed-length string.
- FIXEDSTRING - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- fixedStringSize() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- FixedWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A WindowFn that windows values into fixed-size timestamp-based windows. - flatMap(RawUnionValue, Collector<WindowedValue<?>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStagePruningFunction
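A minimal sketch of windowing into FixedWindows (the one-minute size is illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    public class FixedWindowsExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        PCollection<String> events = p.apply(Create.of("a", "b"));
        // Assign each element to a non-overlapping one-minute window based on
        // its timestamp.
        PCollection<String> windowed =
            events.apply(Window.<String>into(FixedWindows.of(Duration.standardMinutes(1))));
        p.run().waitUntilFinish();
      }
    }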
- flatMap(KV<K, Iterable<WindowedValue<V>>>, RecordCollector<WindowedValue<KV<K, Iterable<V>>>>) - Method in class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
- flatMap(WindowedValue<InputT>, Collector<WindowedValue<RawUnionValue>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkDoFnFunction
- flatMap(WindowedValue<RawUnionValue>, Collector<WindowedValue<T>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkMultiOutputPruningFunction
- flatMap(WindowedValue<T>, Collector<WindowedValue<T>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkAssignWindows
- flatMap(WindowedValue<T>, Collector<WindowedValue<T>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkExplodeWindowsFunction
- FlatMapElements<InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
PTransforms for mapping a simple function that returns iterables over the elements of a PCollection and merging the results. - FlatMapElements.FlatMapWithFailures<InputT, OutputT, FailureT> - Class in org.apache.beam.sdk.transforms
-
A PTransform that adds exception handling to FlatMapElements. - Flat style - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- Section
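A minimal sketch of FlatMapElements (the splitting function is illustrative):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.FlatMapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class FlatMapElementsExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        PCollection<String> lines = p.apply(Create.of("hello world", "beam"));
        // Map each line to several words and merge the resulting iterables.
        PCollection<String> words =
            lines.apply(
                FlatMapElements.into(TypeDescriptors.strings())
                    .via((String line) -> Arrays.asList(line.split(" "))));
        p.run().waitUntilFinish();
      }
    }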
- Flatten - Class in org.apache.beam.sdk.transforms
-
Flatten<T> takes multiple PCollection<T>s bundled into a PCollectionList<T> and returns a single PCollection<T> containing all the elements in all the input PCollections. - Flatten() - Constructor for class org.apache.beam.sdk.transforms.Flatten
- Flatten.Iterables<T> - Class in org.apache.beam.sdk.transforms
-
Flatten.Iterables<T> takes a PCollection<Iterable<T>> and returns a PCollection<T> that contains all the elements from each iterable. - Flatten.PCollections<T> - Class in org.apache.beam.sdk.transforms
-
A PTransform that flattens a PCollectionList into a PCollection containing all the elements of all the PCollections in its input. - Flattened() - Constructor for class org.apache.beam.sdk.schemas.transforms.Select.Flattened
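A minimal sketch of Flatten.pCollections (the two inputs are illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Flatten;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionList;

    public class FlattenExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        PCollection<String> first = p.apply("First", Create.of("1", "2"));
        PCollection<String> second = p.apply("Second", Create.of("3"));
        // Merge several PCollections of the same type into one.
        PCollection<String> merged =
            PCollectionList.of(first).and(second).apply(Flatten.pCollections());
        p.run().waitUntilFinish();
      }
    }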
- flattenedSchema() - Static method in class org.apache.beam.sdk.schemas.transforms.Select
-
Selects every leaf-level field.
- FlattenP - Class in org.apache.beam.runners.jet.processors
-
Jet Processor implementation for Beam's Flatten primitive. - FlattenP.Supplier - Class in org.apache.beam.runners.jet.processors
-
Jet Processor supplier that will provide instances of FlattenP. - flattenRel(RelStructuredTypeFlattener) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- FlattenTransformProvider - Class in org.apache.beam.sdk.schemas.transforms.providers
-
An implementation of TypedSchemaTransformProvider for Flatten. - FlattenTransformProvider() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
- FlattenTransformProvider.Configuration - Class in org.apache.beam.sdk.schemas.transforms.providers
- FlattenTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
- FlattenTranslatorBatch<T> - Class in org.apache.beam.runners.twister2.translators.batch
-
Flatten translator.
- FlattenTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.FlattenTranslatorBatch
- FlattenWithHeterogeneousCoders - Interface in org.apache.beam.sdk.testing
-
Category tag for tests that use a Flatten where the input PCollectionList contains PCollections with heterogeneous coders. - FlinkAssignWindows<T, W> - Class in org.apache.beam.runners.flink.translation.functions
-
Flink FlatMapFunction for implementing Window.Assign. - FlinkAssignWindows(WindowFn<T, W>) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkAssignWindows
- FlinkBatchPortablePipelineTranslator - Class in org.apache.beam.runners.flink
-
A translator that translates bounded portable pipelines into executable Flink pipelines.
- FlinkBatchPortablePipelineTranslator(Map<String, FlinkBatchPortablePipelineTranslator.PTransformTranslator>) - Constructor for class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
- FlinkBatchPortablePipelineTranslator.BatchTranslationContext - Class in org.apache.beam.runners.flink
-
Batch translation context.
- FlinkBatchPortablePipelineTranslator.IsFlinkNativeTransform - Class in org.apache.beam.runners.flink
-
Predicate to determine whether a URN is a Flink native transform.
- FlinkBatchPortablePipelineTranslator.PTransformTranslator - Interface in org.apache.beam.runners.flink
-
Transform translation interface.
- FlinkBoundedSource<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded
-
A Flink Source implementation that wraps a Beam BoundedSource. - FlinkBoundedSource(String, BoundedSource<T>, SerializablePipelineOptions, Boundedness, int) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSource
- FlinkBoundedSource(String, BoundedSource<T>, SerializablePipelineOptions, Boundedness, int, FlinkSource.TimestampExtractor<WindowedValue<T>>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSource
- FlinkBoundedSourceReader<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded
-
A Flink SourceReader implementation that reads from the assigned FlinkSourceSplits by using Beam BoundedReaders. - FlinkBoundedSourceReader(String, SourceReaderContext, PipelineOptions, ScheduledExecutorService, Function<WindowedValue<T>, Long>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSourceReader
- FlinkBoundedSourceReader(String, SourceReaderContext, PipelineOptions, Function<WindowedValue<T>, Long>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSourceReader
- FlinkBroadcastStateInternals<K> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.state
-
StateInternals that uses a Flink OperatorStateBackend to manage the broadcast state. - FlinkBroadcastStateInternals(int, OperatorStateBackend, SerializablePipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkBroadcastStateInternals
- FlinkDetachedRunnerResult - Class in org.apache.beam.runners.flink
-
Result of a detached execution of a Pipeline with Flink. - FlinkDoFnFunction<InputT, OutputT> - Class in org.apache.beam.runners.flink.translation.functions
-
Encapsulates a DoFn inside a Flink RichMapPartitionFunction. - FlinkDoFnFunction(DoFn<InputT, OutputT>, String, WindowingStrategy<?, ?>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, PipelineOptions, Map<TupleTag<?>, Integer>, TupleTag<OutputT>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkDoFnFunction
- FlinkExecutableStageContextFactory - Class in org.apache.beam.runners.flink.translation.functions
-
Singleton class that contains one ExecutableStageContext.Factory per job. - FlinkExecutableStageFunction<InputT> - Class in org.apache.beam.runners.flink.translation.functions
-
Flink operator that passes its input DataSet through an SDK-executed ExecutableStage. - FlinkExecutableStageFunction(String, PipelineOptions, RunnerApi.ExecutableStagePayload, JobInfo, Map<String, Integer>, FlinkExecutableStageContextFactory, Coder, Coder<WindowedValue<InputT>>) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction
- FlinkExecutableStagePruningFunction - Class in org.apache.beam.runners.flink.translation.functions
-
A Flink function that demultiplexes output from a FlinkExecutableStageFunction. - FlinkExecutableStagePruningFunction(int, PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStagePruningFunction
-
Creates a FlinkExecutableStagePruningFunction that extracts elements of the given union tag. - FlinkExecutionEnvironments - Class in org.apache.beam.runners.flink
-
Utilities for Flink execution environments.
- FlinkExecutionEnvironments() - Constructor for class org.apache.beam.runners.flink.FlinkExecutionEnvironments
- FlinkExplodeWindowsFunction<T> - Class in org.apache.beam.runners.flink.translation.functions
-
Explode WindowedValue that belongs to multiple windows into multiple "single window" values, so we can safely group elements by (K, W) tuples. - FlinkExplodeWindowsFunction() - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkExplodeWindowsFunction
- FlinkIdentityFunction<T> - Class in org.apache.beam.runners.flink.translation.functions
-
A map function that outputs the input element without any change.
- FlinkJobInvoker - Class in org.apache.beam.runners.flink
-
Job Invoker for the FlinkRunner. - FlinkJobInvoker(FlinkJobServerDriver.FlinkServerConfiguration) - Constructor for class org.apache.beam.runners.flink.FlinkJobInvoker
- FlinkJobServerDriver - Class in org.apache.beam.runners.flink
-
Driver program that starts a job server for the Flink runner.
- FlinkJobServerDriver.FlinkServerConfiguration - Class in org.apache.beam.runners.flink
-
Flink runner-specific Configuration for the jobServer.
- FlinkKey - Class in org.apache.beam.runners.flink.adapter
- FlinkKey() - Constructor for class org.apache.beam.runners.flink.adapter.FlinkKey
- FlinkKeyUtils - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
-
Utility functions for dealing with key encoding.
- FlinkKeyUtils() - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.FlinkKeyUtils
- FlinkMergingNonShuffleReduceFunction<K, InputT, AccumT, OutputT, W> - Class in org.apache.beam.runners.flink.translation.functions
-
Special version of FlinkReduceFunction that supports merging windows. - FlinkMergingNonShuffleReduceFunction(CombineFnBase.GlobalCombineFn<InputT, AccumT, OutputT>, WindowingStrategy<Object, W>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkMergingNonShuffleReduceFunction
- FlinkMetricContainer - Class in org.apache.beam.runners.flink.metrics
-
Helper class for holding a MetricsContainerImpl and forwarding Beam metrics to Flink accumulators and metrics. - FlinkMetricContainer(RuntimeContext) - Constructor for class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
- FlinkMetricContainerWithoutAccumulator - Class in org.apache.beam.runners.flink.metrics
-
The base helper class for holding a MetricsContainerImpl and forwarding Beam metrics to Flink accumulators and metrics. - FlinkMetricContainerWithoutAccumulator(MetricGroup) - Constructor for class org.apache.beam.runners.flink.metrics.FlinkMetricContainerWithoutAccumulator
- FlinkMiniClusterEntryPoint - Class in org.apache.beam.runners.flink
-
Entry point for starting an embedded Flink cluster.
- FlinkMiniClusterEntryPoint() - Constructor for class org.apache.beam.runners.flink.FlinkMiniClusterEntryPoint
- FlinkMultiOutputPruningFunction<T> - Class in org.apache.beam.runners.flink.translation.functions
-
A FlatMapFunction function that filters out those elements that don't belong in this output. - FlinkMultiOutputPruningFunction(int, PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkMultiOutputPruningFunction
- FlinkNonMergingReduceFunction<K, InputT> - Class in org.apache.beam.runners.flink.translation.functions
-
Reduce function for non-merging GBK implementation.
- FlinkNonMergingReduceFunction(WindowingStrategy<?, ?>, boolean) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkNonMergingReduceFunction
- FlinkNoOpStepContext - Class in org.apache.beam.runners.flink.translation.functions
-
A StepContext for Flink Batch Runner execution. - FlinkNoOpStepContext() - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkNoOpStepContext
- FlinkPartialReduceFunction<K, InputT, AccumT, W> - Class in org.apache.beam.runners.flink.translation.functions
-
This is the first step for executing a Combine.PerKey on Flink. - FlinkPartialReduceFunction(CombineFnBase.GlobalCombineFn<InputT, AccumT, ?>, WindowingStrategy<Object, W>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkPartialReduceFunction
- FlinkPartialReduceFunction(CombineFnBase.GlobalCombineFn<InputT, AccumT, ?>, WindowingStrategy<Object, W>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, PipelineOptions, boolean) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkPartialReduceFunction
- FlinkPipelineOptions - Interface in org.apache.beam.runners.flink
-
Options which can be used to configure the Flink Runner.
- FlinkPipelineOptions.MaxBundleSizeFactory - Class in org.apache.beam.runners.flink
-
Maximum bundle size factory.
- FlinkPipelineOptions.MaxBundleTimeFactory - Class in org.apache.beam.runners.flink
-
Maximum bundle time factory.
- FlinkPipelineRunner - Class in org.apache.beam.runners.flink
-
Runs a Pipeline on Flink via FlinkRunner. - FlinkPipelineRunner(FlinkPipelineOptions, String, List<String>) - Constructor for class org.apache.beam.runners.flink.FlinkPipelineRunner
-
Sets up a Flink pipeline runner.
- FlinkPortableClientEntryPoint - Class in org.apache.beam.runners.flink
-
Flink job entry point to launch a Beam pipeline by executing an external SDK driver program.
- FlinkPortableClientEntryPoint(String) - Constructor for class org.apache.beam.runners.flink.FlinkPortableClientEntryPoint
- FlinkPortablePipelineTranslator<T> - Interface in org.apache.beam.runners.flink
-
Interface for portable Flink translators.
- FlinkPortablePipelineTranslator.Executor - Interface in org.apache.beam.runners.flink
-
A handle used to execute a translated pipeline.
- FlinkPortablePipelineTranslator.TranslationContext - Interface in org.apache.beam.runners.flink
-
The context used for pipeline translation.
- FlinkPortableRunnerResult - Class in org.apache.beam.runners.flink
-
Result of executing a portable Pipeline with Flink. - FlinkPortableRunnerUtils - Class in org.apache.beam.runners.flink.translation.utils
-
Various utilities related to portability.
- FlinkReduceFunction<K, AccumT, OutputT, W> - Class in org.apache.beam.runners.flink.translation.functions
-
This is the second part for executing a Combine.PerKey on Flink, the second part is FlinkReduceFunction. - FlinkReduceFunction(CombineFnBase.GlobalCombineFn<?, AccumT, OutputT>, WindowingStrategy<Object, W>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkReduceFunction
- FlinkReduceFunction(CombineFnBase.GlobalCombineFn<?, AccumT, OutputT>, WindowingStrategy<Object, W>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, PipelineOptions, boolean) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkReduceFunction
- FlinkRunner - Class in org.apache.beam.runners.flink
-
A PipelineRunner that executes the operations in the pipeline by first translating them to a Flink Plan and then executing them either locally or on a Flink cluster, depending on the configuration. - FlinkRunner(FlinkPipelineOptions) - Constructor for class org.apache.beam.runners.flink.FlinkRunner
- FlinkRunnerRegistrar - Class in org.apache.beam.runners.flink
-
AutoService registrar - will register FlinkRunner and FlinkOptions as possible pipeline runner services.
- FlinkRunnerRegistrar.Options - Class in org.apache.beam.runners.flink
-
Pipeline options registrar.
- FlinkRunnerRegistrar.Runner - Class in org.apache.beam.runners.flink
-
Pipeline runner registrar.
- FlinkRunnerResult - Class in org.apache.beam.runners.flink
-
Result of executing a Pipeline with Flink. - FlinkServerConfiguration() - Constructor for class org.apache.beam.runners.flink.FlinkJobServerDriver.FlinkServerConfiguration
- FlinkSideInputReader - Class in org.apache.beam.runners.flink.translation.functions
-
A SideInputReader for the Flink Batch Runner. - FlinkSideInputReader(Map<PCollectionView<?>, WindowingStrategy<?, ?>>, RuntimeContext) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkSideInputReader
- FlinkSource<T, OutputT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source
-
The base class for FlinkBoundedSource and FlinkUnboundedSource. - FlinkSource(String, Source<T>, SerializablePipelineOptions, Boundedness, int) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- FlinkSource.TimestampExtractor<T> - Interface in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source
- FlinkSourceReaderBase<T, OutputT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source
-
An abstract implementation of SourceReader which encapsulates Beam Sources for data reading. - FlinkSourceReaderBase(String, ScheduledExecutorService, SourceReaderContext, PipelineOptions, Function<OutputT, Long>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- FlinkSourceReaderBase(String, SourceReaderContext, PipelineOptions, Function<OutputT, Long>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- FlinkSourceReaderBase.ReaderAndOutput - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source
-
A wrapper for the reader and its associated information.
- FlinkSourceSplit<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source
-
A Flink SourceSplit implementation that encapsulates a Beam Source. - FlinkSourceSplit(int, Source<T>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplit
- FlinkSourceSplit(int, Source<T>, byte[], UnboundedSource.CheckpointMark) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplit
- FlinkSourceSplitEnumerator<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source
- FlinkSourceSplitEnumerator(SplitEnumeratorContext<FlinkSourceSplit<T>>, Source<T>, PipelineOptions, int) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplitEnumerator
- FlinkSourceSplitEnumerator(SplitEnumeratorContext<FlinkSourceSplit<T>>, Source<T>, PipelineOptions, int, boolean) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplitEnumerator
- FlinkStateBackendFactory - Interface in org.apache.beam.runners.flink
-
Constructs a StateBackend to use from Flink pipeline options.
- FlinkStatefulDoFnFunction<K, V, OutputT> - Class in org.apache.beam.runners.flink.translation.functions
-
A RichGroupReduceFunction for stateful ParDo in Flink Batch Runner. - FlinkStatefulDoFnFunction(DoFn<KV<K, V>, OutputT>, String, WindowingStrategy<?, ?>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, PipelineOptions, Map<TupleTag<?>, Integer>, TupleTag<OutputT>, Coder<KV<K, V>>, Map<TupleTag<?>, Coder<?>>, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.flink.translation.functions.FlinkStatefulDoFnFunction
- FlinkStateInternals<K> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.state
-
StateInternals that uses a Flink KeyedStateBackend to manage state. - FlinkStateInternals(KeyedStateBackend<FlinkKey>, Coder<K>, Coder<? extends BoundedWindow>, SerializablePipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals
- FlinkStateInternals.EarlyBinder - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.state
-
Eagerly create user state to work around https://jira.apache.org/jira/browse/FLINK-12653.
- FlinkStateInternals.FlinkStateNamespaceKeySerializer - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.state
- FlinkStateInternals.FlinkStateNamespaceKeySerializer.FlinkStateNameSpaceSerializerSnapshot - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.state
-
Serializer configuration snapshot for compatibility and format evolution.
- FlinkStateNamespaceKeySerializer(Coder<? extends BoundedWindow>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- FlinkStateNameSpaceSerializerSnapshot() - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer.FlinkStateNameSpaceSerializerSnapshot
- FlinkStepContext() - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.FlinkStepContext
- FlinkStreamingAggregationsTranslators - Class in org.apache.beam.runners.flink
- FlinkStreamingAggregationsTranslators() - Constructor for class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
- FlinkStreamingAggregationsTranslators.ConcatenateAsIterable<T> - Class in org.apache.beam.runners.flink
- FlinkStreamingPortablePipelineTranslator - Class in org.apache.beam.runners.flink
-
Translate an unbounded portable pipeline representation into a Flink pipeline representation.
- FlinkStreamingPortablePipelineTranslator() - Constructor for class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
- FlinkStreamingPortablePipelineTranslator(Map<String, FlinkStreamingPortablePipelineTranslator.PTransformTranslator<FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext>>) - Constructor for class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
- FlinkStreamingPortablePipelineTranslator.IsFlinkNativeTransform - Class in org.apache.beam.runners.flink
-
Predicate to determine whether a URN is a Flink native transform.
- FlinkStreamingPortablePipelineTranslator.PTransformTranslator<T> - Interface in org.apache.beam.runners.flink
- FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext - Class in org.apache.beam.runners.flink
-
Streaming translation context.
- FlinkUnboundedSource<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded
-
A Flink Source implementation that wraps a Beam UnboundedSource. - FlinkUnboundedSource(String, UnboundedSource<T, ?>, SerializablePipelineOptions, int) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSource
- FlinkUnboundedSource(String, UnboundedSource<T, ?>, SerializablePipelineOptions, int, FlinkSource.TimestampExtractor<WindowedValue<ValueWithRecordId<T>>>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSource
- FlinkUnboundedSourceReader<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded
-
A Flink SourceReader implementation that reads from the assigned FlinkSourceSplits by using Beam UnboundedReaders. - FlinkUnboundedSourceReader(String, SourceReaderContext, PipelineOptions, ScheduledExecutorService, Function<WindowedValue<ValueWithRecordId<T>>, Long>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
- FlinkUnboundedSourceReader(String, SourceReaderContext, PipelineOptions, Function<WindowedValue<ValueWithRecordId<T>>, Long>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
- FLOAT - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- FLOAT - Enum constant in enum class org.apache.beam.sdk.transforms.display.DisplayData.Type
- FLOAT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- FLOAT - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of float fields.
- FLOAT32 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- FLOAT32 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- FLOAT64 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- FLOAT64 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- FloatCoder - Class in org.apache.beam.sdk.coders
-
A FloatCoder encodes Float values in 4 bytes using Java serialization. - floats() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The TypeDescriptor for Float. - floatToByteArray(float) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
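A minimal sketch of FloatCoder (the encoded value is illustrative):

    import java.io.ByteArrayOutputStream;
    import org.apache.beam.sdk.coders.FloatCoder;

    public class FloatCoderExample {
      public static void main(String[] args) throws Exception {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        // Encode one Float; the encoding occupies exactly 4 bytes.
        FloatCoder.of().encode(1.5f, out);
        System.out.println(out.toByteArray().length); // prints 4
      }
    }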
- flush() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
- flush() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
- flush() - Method in interface org.apache.beam.sdk.fn.data.CloseableFnDataReceiver
-
Deprecated. To be removed once splitting/checkpointing are available in SDKs and rewinding in readers.
- flush() - Method in interface org.apache.beam.sdk.io.FileIO.Sink
-
Flushes the buffered state (if any) before the channel is closed.
- flush() - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
- flush() - Method in class org.apache.beam.sdk.io.TextIO.Sink
- flush() - Method in class org.apache.beam.sdk.io.TFRecordIO.Sink
- flush() - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.Sink
- flush() - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
- flush(boolean) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- flush(String, long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
-
Flush a given stream up to the given offset.
- flush(String, long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- flush(String, long) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- FLUSH_ROWS - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.RpcMethod
- flushBufferedMetrics() - Method in interface org.apache.beam.sdk.io.kafka.KafkaMetrics
- flushBufferedMetrics() - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
-
Export all metrics recorded in this instance to the underlying perWorkerMetrics containers. - flushBufferedMetrics() - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.NoOpKafkaMetrics
- flushBundle(DoFn.OnTimerContext) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedBatchedSolaceWriter
- flushData() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- fn(Contextful.Fn<InputT, OutputT>, Requirements) - Static method in class org.apache.beam.sdk.transforms.Contextful
-
Same as Contextful.of(ClosureT, org.apache.beam.sdk.transforms.Requirements) but with better type inference behavior for the case of Contextful.Fn. - fn(ProcessFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Contextful
- fn(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Contextful
-
Binary compatibility adapter for Contextful.fn(ProcessFunction). - FnApiControlClient - Class in org.apache.beam.runners.fnexecution.control
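A minimal sketch of Contextful.fn with an explicit Requirements argument (the wrapped function is illustrative):

    import org.apache.beam.sdk.transforms.Contextful;
    import org.apache.beam.sdk.transforms.Requirements;

    public class ContextfulExample {
      public static void main(String[] args) {
        // Wrap a simple function; Requirements.empty() declares no side inputs.
        Contextful<Contextful.Fn<String, Integer>> fn =
            Contextful.fn(
                (String s, Contextful.Fn.Context c) -> s.length(), Requirements.empty());
        System.out.println(fn.getRequirements());
      }
    }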
-
A client for the control plane of an SDK harness, which can issue requests to it over the Fn API.
- FnApiControlClientPoolService - Class in org.apache.beam.runners.fnexecution.control
-
A Fn API control service which adds incoming SDK harness connections to a sink.
- FnDataReceiver<T> - Interface in org.apache.beam.sdk.fn.data
-
A receiver of streamed data.
- FnDataService - Interface in org.apache.beam.runners.fnexecution.data
-
The FnDataService is able to forward inbound elements to a consumer and is also a consumer of outbound elements. - FnService - Interface in org.apache.beam.sdk.fn.server
-
An interface sharing common behavior with services used during execution of user Fns.
- Footer() - Constructor for class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- FooterCoder() - Constructor for class org.apache.beam.runners.dataflow.internal.IsmFormat.FooterCoder
- forBagUserStateHandlerFactory(ProcessBundleDescriptors.ExecutableProcessBundleDescriptor, StateRequestHandlers.BagUserStateHandlerFactory) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
-
Returns an adapter which converts a StateRequestHandlers.BagUserStateHandlerFactory to a StateRequestHandler. - forBatch(PCollectionView<ViewT>) - Static method in class org.apache.beam.runners.dataflow.CreateDataflowView
- forBoolean(Boolean) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject representing the given value. - forBytes() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
-
Returns a HllCount.Init.Builder for a HllCount.Init combining PTransform that computes bytes-type HLL++ sketches. - forClass(Class<?>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
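A minimal sketch combining HllCount.Init with HllCount.Extract (element values are illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.zetasketch.HllCount;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;

    public class HllCountExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        PCollection<Long> ids = p.apply(Create.of(1L, 2L, 2L, 3L));
        // Build an HLL++ sketch, then extract the estimated distinct count.
        PCollection<byte[]> sketch = ids.apply(HllCount.Init.forLongs().globally());
        PCollection<Long> distinct = sketch.apply(HllCount.Extract.globally());
        p.run().waitUntilFinish();
      }
    }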
-
Constructs a CloudObject to be used for serializing an instance of the supplied class for transport via the Dataflow API. - forClassName(String) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject to be used for serializing data to be deserialized using the supplied class name for transport via the Dataflow API. - forCoder(TypeDescriptor<?>, Coder<?>) - Static method in class org.apache.beam.sdk.coders.CoderProviders
-
Creates a CoderProvider that always returns the given coder for the specified type. - forConsumers(List<DataEndpoint<?>>, List<TimerEndpoint<?>>) - Static method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Creates a receiver that is able to consume elements multiplexing on to the provided set of endpoints.
- forDescriptor(Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
- forDescriptor(ProtoDomain, Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
-
Create a new ProtoDynamicMessageSchema from a ProtoDomain and for a descriptor. - forDescriptor(ProtoDomain, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
-
Create a new ProtoDynamicMessageSchema from a ProtoDomain and for a message. - forEncoding(ByteString) - Static method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
- forever(Trigger) - Static method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
-
Create a composite trigger that repeatedly executes the trigger repeated, firing each time it fires and ignoring any indications to finish. - forField(TypeDescriptor<?>, Field, int) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
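A minimal sketch of Repeatedly.forever wrapping a processing-time trigger (the delay is illustrative):

    import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
    import org.apache.beam.sdk.transforms.windowing.Repeatedly;
    import org.apache.beam.sdk.transforms.windowing.Trigger;
    import org.joda.time.Duration;

    public class RepeatedlyExample {
      public static void main(String[] args) {
        // Fire one minute after the first element of each pane, then repeat.
        Trigger trigger =
            Repeatedly.forever(
                AfterProcessingTime.pastFirstElementInPane()
                    .plusDelayOf(Duration.standardMinutes(1)));
        System.out.println(trigger);
      }
    }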
- forFloat(Double) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject representing the given value. - forFloat(Float) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject representing the given value. - forGetter(Method, int) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forGetter(TypeDescriptor<?>, Method, int) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forHandler(RunnerApi.Environment, InstructionRequestHandler) - Static method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
-
Create a new RemoteEnvironment for the provided RunnerApi.Environment and AutoCloseable InstructionRequestHandler. - forInteger(Integer) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject representing the given value. - forInteger(Long) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject representing the given value. - forIntegers() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
-
Returns a HllCount.Init.Builder for a HllCount.Init combining PTransform that computes integer-type HLL++ sketches. - forIterableSideInput(String, String, Coder<V>, Coder<W>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.SideInputHandlerFactory
- forIterableSideInput(String, String, Coder<V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory
- forIterableSideInput(String, String, Coder<V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.StreamingSideInputHandlerFactory
- forKey(K) - Static method in class org.apache.beam.runners.spark.stateful.SparkStateInternals
- forKeyAndState(K, Table<String, String, byte[]>) - Static method in class org.apache.beam.runners.spark.stateful.SparkStateInternals
- forKnownType(Object) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject representing the given value of a well-known cloud object type. - forLongs() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
-
Returns a HllCount.Init.Builder for a HllCount.Init combining PTransform that computes long-type HLL++ sketches. - format - Variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- FormatAsTextFn() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.FormatAsTextFn
- formatByteStringRange(Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Returns a formatted string of a partition for debugging.
- formatRecord(ElementT, Schema) - Method in interface org.apache.beam.sdk.extensions.avro.io.AvroIO.RecordFormatter
-
Deprecated.
- formatRecord(UserT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Convert an input record type into the output type.
- formatTimestamp(Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
-
Formats an Instant timestamp with additional Beam-specific metadata, such as indicating whether the timestamp is the end of the global window or one of the distinguished values BoundedWindow.TIMESTAMP_MIN_VALUE or BoundedWindow.TIMESTAMP_MAX_VALUE. - formatTimestampWithTimeZone(DateTime) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- forMultimapSideInput(String, String, KvCoder<K, V>, Coder<W>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.SideInputHandlerFactory
- forMultimapSideInput(String, String, KvCoder<K, V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory
- forMultimapSideInput(String, String, KvCoder<K, V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.StreamingSideInputHandlerFactory
- forNewInput(Instant, InputT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
Called by the Watch transform to create a new independent termination state for a newly arrived InputT. - forOneOf(String, boolean, Map<String, FieldValueTypeInformation>) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forOrdinal(int) - Static method in enum class org.apache.beam.sdk.io.kafka.KafkaTimestampType
- forProject(String, int, String) - Static method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
-
Initializes a client for managing transform service instances.
- forRegistry(MetricRegistry) - Static method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
- forRegistry(MetricRegistry) - Static method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
- forRequestObserver(String, StreamObserver<BeamFnApi.InstructionRequest>, ConcurrentMap<String, BeamFnApi.ProcessBundleDescriptor>) - Static method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
-
Returns a
FnApiControlClient
which will submit its requests to the provided observer. - forService(InstructionRequestHandler) - Static method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironmentFactory
- forSetter(Method) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forSetter(Method, String) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forSetter(TypeDescriptor<?>, Method) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forSetter(TypeDescriptor<?>, Method, String) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- forSideInputHandlerFactory(Map<String, Map<String, ProcessBundleDescriptors.SideInputSpec>>, StateRequestHandlers.SideInputHandlerFactory) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
-
Returns an adapter which converts a
StateRequestHandlers.SideInputHandlerFactory
to a StateRequestHandler
. - forSqlType(Schema.TypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.BigDecimalConverter
- forStage(ExecutableStage) - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
- forStage(ExecutableStage) - Method in interface org.apache.beam.runners.fnexecution.control.JobBundleFactory
- forStage(ExecutableStage) - Method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
-
Deprecated.
- forStage(ExecutableStage, Map<RunnerApi.ExecutableStagePayload.SideInputId, PCollectionView<?>>, SideInputHandler) - Static method in class org.apache.beam.runners.fnexecution.translation.StreamingSideInputHandlerFactory
-
Creates a new state handler for the given stage.
- forStage(ExecutableStage, BatchSideInputHandlerFactory.SideInputGetter) - Static method in class org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory
-
Creates a new state handler for the given stage.
- forStreamFromSources(List<Integer>, Map<Integer, GlobalWatermarkHolder.SparkWatermarks>) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Build the
TimerInternals
according to the feeding streams. - forStreaming(PCollectionView<ViewT>) - Static method in class org.apache.beam.runners.dataflow.CreateDataflowView
- forString(String) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a
CloudObject
representing the given value. - forStrings() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
-
Returns a
HllCount.Init.Builder
for a HllCount.Init
combining PTransform
that computes string-type HLL++ sketches. - forThrowable(Throwable) - Static method in class org.apache.beam.sdk.values.EncodableThrowable
-
Wraps
throwable
and returns the result. - forTransformHierarchy(TransformHierarchy, PipelineOptions) - Static method in class org.apache.beam.sdk.Pipeline
- forTypeName(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
- forUserState(String, String, Coder<K>, Coder<V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.state.InMemoryBagUserStateFactory
- forUserState(String, String, Coder<K>, Coder<V>, Coder<W>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandlerFactory
- ForwardingClientResponseObserver<ReqT,
RespT> - Class in org.apache.beam.sdk.fn.stream -
A
ClientResponseObserver
which delegates all StreamObserver
calls. - forWriter(LogWriter) - Static method in class org.apache.beam.runners.fnexecution.logging.GrpcLoggingService
- freeze() - Method in class org.apache.beam.runners.jet.metrics.JetMetricResults
- from(double, double) - Static method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
-
A representation for the amount of known completed and remaining work.
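For illustration, a minimal sketch of how a custom restriction tracker might report progress with this factory; completedWork and remainingWork are hypothetical fields of the tracker, not part of the API:

    // Inside a RestrictionTracker subclass that implements RestrictionTracker.HasProgress.
    @Override
    public RestrictionTracker.Progress getProgress() {
      // Report work completed so far and an estimate of what remains.
      return RestrictionTracker.Progress.from(completedWork, remainingWork);
    }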
- from(long) - Static method in class org.apache.beam.sdk.io.GenerateSequence
-
Specifies the minimum number to generate (inclusive).
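As a usage sketch, pairing this with to(), whose upper bound is exclusive:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create();
    // Emits the longs 0..999.
    PCollection<Long> nums = p.apply(GenerateSequence.from(0).to(1000));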
- from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Reads from the table specified by a
TableReference
. - from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- from(Struct) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.PartitionMetadataMapper
-
Transforms a
Struct
representing a partition metadata row into a PartitionMetadata
model. - from(ConfigT) - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
-
Produces a SchemaTransform from ConfigT.
- from(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
-
Reads from the given filename or filepattern.
- from(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
- from(String) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
- from(String) - Static method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
-
Instantiates a cross-language wrapper for a Python transform with a given transform name.
- from(String) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
-
Reads text from the file(s) with the given filename or filename pattern.
- from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Reads a BigQuery table specified as
"[project_id]:[dataset_id].[table_id]"
or"[dataset_id].[table_id]"
for tables within the current project. - from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- from(String) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
- from(String) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
- from(String) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
-
Provides the collection name to read from Solr.
- from(String) - Method in class org.apache.beam.sdk.io.TextIO.Read
-
Reads text from the file(s) with the given filename or filename pattern.
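A minimal sketch, given an existing Pipeline p; the file pattern is illustrative:

    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.values.PCollection;

    // Each output element is one line from the matched files.
    PCollection<String> lines = p.apply(TextIO.read().from("gs://my-bucket/logs/*.txt"));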
- from(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
-
Returns a transform for reading TFRecord files that reads from the file(s) with the given filename or filename pattern.
- from(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
Reads a single XML file or a set of XML files defined by a Java "glob" file pattern.
- from(String, String) - Static method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
-
Instantiates a cross-language wrapper for a Python transform with a given transform name.
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Produces a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
-
Produces a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(String, Row, Schema) - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
-
Produces a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
-
Produces a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
-
Produces a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
Produces a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(ExecutorService) - Static method in class org.apache.beam.sdk.fn.test.TestExecutors
- from(Supplier<ExecutorService>) - Static method in class org.apache.beam.sdk.fn.test.TestExecutors
- from(Map<String, String>) - Static method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Deprecated. Expects a map keyed by logger Names with values representing Levels. - from(Map<String, String>) - Static method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
-
Expects a map keyed by logger Names with values representing LogLevels. - from(DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
- from(WindowIntoTransformProvider.Configuration) - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
- from(SqsReadConfiguration) - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadSchemaTransformProvider
- from(BoundedSource<T>) - Method in class org.apache.beam.sdk.io.Read.Builder
-
Returns a new
Read.Bounded
PTransform
reading from the given BoundedSource
. - from(BoundedSource<T>) - Static method in class org.apache.beam.sdk.io.Read
-
Returns a new
Read.Bounded
PTransform
reading from the given BoundedSource
. - from(CsvWriteTransformProvider.CsvWriteConfiguration) - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
- from(FileBasedSource<T>) - Static method in class org.apache.beam.sdk.io.CompressedSource
-
Creates a
CompressedSource
from an underlying FileBasedSource
. - from(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
- from(FileWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
-
Builds a
SchemaTransform
from a FileWriteSchemaTransformConfiguration
. - from(MatchResult.Metadata) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
- from(BigQueryExportReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
-
Returns the expected
SchemaTransform
of the configuration. - from(BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- from(BigQueryWriteConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider
- from(BigQueryWriteConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- from(BigQueryWriteConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteSchemaTransformProvider
- from(BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
- from(BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
- from(PubsubReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- from(PubsubWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
- from(PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- from(PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- from(SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
- from(SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- from(SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- from(IcebergCdcReadSchemaTransformProvider.Configuration) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCdcReadSchemaTransformProvider
- from(IcebergReadSchemaTransformProvider.Configuration) - Method in class org.apache.beam.sdk.io.iceberg.IcebergReadSchemaTransformProvider
- from(IcebergWriteSchemaTransformProvider.Configuration) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider
- from(JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
- from(JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
- from(JsonWriteTransformProvider.JsonWriteConfiguration) - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider
- from(KafkaReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- from(KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- from(SingleStoreSchemaTransformReadConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadProvider
-
Returns the expected
SchemaTransform
of the configuration. - from(SingleStoreSchemaTransformWriteConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
-
Returns the expected
SchemaTransform
of the configuration. - from(Solace.Queue) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
-
Set the queue name to read from.
- from(Solace.Topic) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
-
Set the topic name to read from.
- from(TFRecordReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider
-
Returns the expected
SchemaTransform
of the configuration. - from(TFRecordWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider
-
Returns the expected
SchemaTransform
of the configuration. - from(UnboundedSource<T, ?>) - Method in class org.apache.beam.sdk.io.Read.Builder
-
Returns a new
Read.Unbounded
PTransform
reading from the given UnboundedSource
. - from(UnboundedSource<T, ?>) - Static method in class org.apache.beam.sdk.io.Read
- from(ManagedSchemaTransformProvider.ManagedConfig) - Method in class org.apache.beam.sdk.managed.ManagedSchemaTransformProvider
- from(TestSchemaTransformProvider.Config) - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
-
Reads from the given filename or filepattern.
- from(ValueProvider<String>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
-
Reads from the given file name or pattern ("glob").
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
-
Same as
from(filepattern)
, but accepting a ValueProvider
. - from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Same as
from(String)
, but with a ValueProvider
. - from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
-
Reads from the given filename or filepattern.
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TextIO.Read
-
Same as
from(filepattern)
, but accepting a ValueProvider
. - from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
-
Same as
from(filepattern)
, but accepting a ValueProvider
. - from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
Reads a single XML file or a set of XML files defined by a Java "glob" file pattern.
- from(GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
- from(FlattenTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
- from(JavaExplodeTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
- from(JavaFilterTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
- from(JavaMapToFieldsTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
- from(LoggingTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
- from(HasDisplayData) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Collect the
DisplayData
from a component. - from(Row) - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
- from(Row) - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
Produces a
SchemaTransform
from some transform-specific configuration object. - from(Row) - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
-
Produces a
SchemaTransform
from a Row configuration. - from(TableIdentifier) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- fromArgs(String...) - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
-
Sets the command line arguments to parse when constructing the
PipelineOptions
. - fromArgs(String...) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Sets the command line arguments to parse when constructing the
PipelineOptions
. - fromArray(T...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables
-
Returns a
PrefetchableIterable
over the specified values. - fromArray(T...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
-
Returns a
PrefetchableIterator
over the specified values. - fromAvroType(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
-
Creates an
AvroUtils.FixedBytesField
from an Avro type. - fromBeamFieldType(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
-
Creates an
AvroUtils.FixedBytesField
from a Beam Schema.FieldType
. - fromByteArray(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
Utility method for deserializing a byte array using the specified coder.
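An illustrative round trip through these Spark-runner helpers, assuming the matching toByteArray method in the same class:

    import org.apache.beam.runners.spark.coders.CoderHelpers;
    import org.apache.beam.sdk.coders.StringUtf8Coder;

    byte[] bytes = CoderHelpers.toByteArray("hello", StringUtf8Coder.of());
    String decoded = CoderHelpers.fromByteArray(bytes, StringUtf8Coder.of());
    // decoded.equals("hello")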
- fromByteArray(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.CoderHelpers
-
Utility method for deserializing a byte array using the specified coder.
- fromByteArray(byte[], Coder<T>) - Static method in class org.apache.beam.runners.twister2.utils.TranslationUtils
-
Utility method for deserializing a byte array using the specified coder.
- fromByteArray(byte[], WindowedValues.WindowedValueCoder<T>) - Static method in class org.apache.beam.runners.twister2.utils.TranslationUtils
-
Utility method for deserializing a byte array using the specified coder.
- fromByteArrays(Collection<byte[]>, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
Utility method for deserializing an Iterable of byte arrays using the specified coder.
- fromByteFunction(Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
A function wrapper for converting a byte array to an object.
- FromByteFunction(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.runners.spark.coders.CoderHelpers.FromByteFunction
- fromByteFunctionIterable(Coder<K>, Coder<V>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
A function wrapper for converting a byte array pair to a key-value pair, where values are
Iterable
. - fromCanonical(Compression) - Static method in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- fromCloudDuration(String) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
-
Converts a Dataflow API duration string into a
Duration
. - fromCloudObject(CloudObject) - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
-
Converts back into the original object from a provided
CloudObject
. - fromCloudObject(CloudObject) - Method in class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
-
Convert from a cloud object.
- fromCloudObject(CloudObject) - Method in class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
-
Convert from a cloud object.
- fromCloudPubsubMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
-
Transforms messages publishable using PubsubIO into their equivalent Pub/Sub Lite publishable messages.
- fromCloudTime(String) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
-
Converts a time value received via the Dataflow API into the corresponding
Instant
. - fromComponents(String, String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Creates a GcsPath from bucket and object components.
- fromComponents(List<Coder<?>>, byte[]) - Static method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- fromComponents(List<Coder<?>>, byte[], CoderTranslation.TranslationContext) - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderTranslator
- fromConfig(FlinkJobServerDriver.FlinkServerConfiguration) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
- fromConfig(FlinkJobServerDriver.FlinkServerConfiguration, JobServerDriver.JobInvokerFactory) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
- fromConfig(SparkJobServerDriver.SparkServerConfiguration) - Static method in class org.apache.beam.runners.spark.SparkJobServerDriver
- fromConfigRow(Row, PipelineOptions) - Method in class org.apache.beam.sdk.schemas.transforms.SchemaTransformTranslation.SchemaTransformPayloadTranslator
- fromExceptionInformation(RecordT, Coder<RecordT>, Exception, String) - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
- fromExecutableStage(String, ExecutableStage, Endpoints.ApiServiceDescriptor) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
- fromExecutableStage(String, ExecutableStage, Endpoints.ApiServiceDescriptor, Endpoints.ApiServiceDescriptor) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
-
Note that the
BeamFnApi.ProcessBundleDescriptor
is constructed by adding gRPC read and write nodes and wiring them to the specified data endpoint.
-
Encapsulates a selected table name.
- fromFeedRange(FeedRange) - Static method in class org.apache.beam.sdk.io.azure.cosmos.NormalizedRange
- fromFile(File, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
- fromFile(String) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
- fromFile(String, OutputStream) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
- fromGenericAvroSchema(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert an Avro
Schema
to a BigQuery TableSchema
. - fromGenericAvroSchema(Schema, Boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert an Avro
Schema
to a BigQuery TableSchema
. - fromHex(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- fromHttpResponse(HttpResponse) - Static method in class org.apache.beam.sdk.io.solace.broker.BrokerResponse
- fromInt(Integer, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
- fromInt(Integer, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
- fromInt(Integer, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.DateConversion
- fromInt(Integer, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeConversion
- fromIr(Ir) - Static method in class org.apache.beam.sdk.extensions.sbe.SerializableIr
-
Creates a new instance from
ir
. - fromIr(Ir, SbeSchema.IrOptions) - Static method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
-
Creates a new
SbeSchema
from the given intermediate representation. - fromJsonFile(File) - Method in class org.apache.beam.sdk.io.cdap.ConfigWrapper
-
Gets a
ConfigWrapper
from a JSON file. - fromJsonString(String, Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
- fromJsonString(String) - Method in class org.apache.beam.sdk.io.cdap.ConfigWrapper
-
Gets a
ConfigWrapper
from a JSON string. - fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeMicrosConversion
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampConversion
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampMicrosConversion
- fromMap(Map<String, String>) - Static method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
-
Returns a new configuration instance using provided flags.
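A small sketch; the Hadoop property and value are illustrative:

    import java.util.Collections;
    import org.apache.beam.sdk.io.hadoop.SerializableConfiguration;

    SerializableConfiguration conf = SerializableConfiguration.fromMap(
        Collections.singletonMap("fs.defaultFS", "hdfs://namenode:8020"));
    // conf.get() returns the wrapped org.apache.hadoop.conf.Configuration.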
- fromModel(Message) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Converts a model
Message
to an HL7v2 message. - fromName(String) - Static method in enum class org.apache.beam.io.debezium.Connectors
-
Returns a connector class corresponding to the given connector name.
- fromName(String) - Static method in class org.apache.beam.sdk.io.solace.data.Solace.Queue
- fromName(String) - Static method in class org.apache.beam.sdk.io.solace.data.Solace.Topic
- fromObject(StorageObject) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Creates a GcsPath from a StorageObject.
- fromOptions(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
-
Constructs a translator from the provided options.
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderFileSystemRegistrar
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.LocalFileSystemRegistrar
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner
-
Construct a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.TestDataflowRunner
-
Constructs a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.util.GcsStager
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.direct.DirectRunner
-
Construct a
DirectRunner
from the provided options. - fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.flink.FlinkRunner
-
Construct a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.flink.TestFlinkRunner
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.jet.JetRunner
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.portability.PortableRunner
-
Constructs a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.portability.testing.TestPortableRunner
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.portability.testing.TestUniversalRunner
-
Constructs a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.prism.PrismRunner
-
Invoked from
Pipeline.run()
where PrismRunner
instantiates using PrismPipelineOptions
configuration details. - fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.prism.TestPrismRunner
-
Invoked from
Pipeline.run()
where TestPrismRunner
instantiates using TestPrismPipelineOptions
configuration details. - fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Creates and returns a new SparkRunner with specified options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunnerDebugger
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
-
Creates and returns a new SparkStructuredStreamingRunner with specified options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.TestSparkRunner
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.twister2.Twister2Runner
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.twister2.Twister2TestRunner
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.auth.GcpCredentialFactory
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsFileSystemRegistrar
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.s3.DefaultS3FileSystemSchemeRegistrar
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemRegistrar
- fromOptions(PipelineOptions) - Method in interface org.apache.beam.sdk.io.aws2.s3.S3FileSystemSchemeRegistrar
-
Creates zero or more
S3FileSystemConfiguration
instances from the given PipelineOptions
. - fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.blobstore.AzureBlobStoreFileSystemRegistrar
- fromOptions(PipelineOptions) - Method in interface org.apache.beam.sdk.io.FileSystemRegistrar
-
Creates zero or more
filesystems
from the given PipelineOptions
. - fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Creates an instance of this rule using provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.io.googleads.GoogleAdsUserCredentialFactory
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemRegistrar
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.PipelineRunner
-
Constructs a runner from the provided
PipelineOptions
. - fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.CrashingRunner
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.TestPipeline
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
-
Creates a
ResourceHints
instance with hints supplied in options. - fromOptions(PipelineOptions, Function<ClientConfig, JetInstance>) - Static method in class org.apache.beam.runners.jet.JetRunner
- fromParams(String[]) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
- fromParams(String[]) - Static method in class org.apache.beam.runners.spark.SparkJobServerDriver
- fromParams(DefaultFilenamePolicy.Params) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
Construct a
DefaultFilenamePolicy
from a DefaultFilenamePolicy.Params
object. - fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Creates a class representing a Pub/Sub subscription from the specified subscription path.
- fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Creates a class representing a Cloud Pub/Sub topic from the specified topic path.
- fromPath(Path, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
- fromProcessFunctionWithOutputType(ProcessFunction<InputT, OutputT>, TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.InferableFunction
- fromProto(PubsubMessage) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages
- fromPTransform(RunnerApi.PTransform) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
- fromPTransform(RunnerApi.PTransform) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
- fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Reads results received after executing the given query.
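For example, given an existing Pipeline p; the project, dataset, and query are placeholders:

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<TableRow> rows = p.apply(
        BigQueryIO.readTableRows()
            .fromQuery("SELECT name, year FROM `my-project.samples.names`")
            .usingStandardSql());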
- fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- fromQuery(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
A query to be executed in Snowflake.
- fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Same as
fromQuery(String)
, but with a ValueProvider
. - fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
- fromRawEvents(Coder<T>, List<TestStream.Event<T>>) - Static method in class org.apache.beam.sdk.testing.TestStream
-
For internal use only.
- fromResourceName(String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Creates a GcsPath from a OnePlatform resource name in string form.
- fromRow(Row) - Static method in class org.apache.beam.sdk.values.Row
-
Creates a row builder based on the specified row.
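A minimal sketch of copying a row while overriding one field; original is an existing Row and the field name is hypothetical:

    import org.apache.beam.sdk.values.Row;

    // Keeps all other field values from the original row.
    Row updated = Row.fromRow(original).withFieldValue("status", "DONE").build();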
- fromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
- fromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
-
Given a type, returns a function that converts from a
Row
object to that type. - fromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated.
- fromRowFunction(TypeDescriptor<T>) - Method in interface org.apache.beam.sdk.schemas.SchemaProvider
-
Given a type, returns a function that converts from a
Row
object to that type. - fromRows(Class<OutputT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
-
Convert a
PCollection
<Row> into a PCollection
<OutputT>. - fromRows(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
-
Convert a
PCollection
<Row> into a PCollection
<OutputT>. - fromS3Options(S3Options) - Static method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
- fromSerializableFunctionWithOutputType(SerializableFunction<InputT, OutputT>, TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.SimpleFunction
- fromSnapshot(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- fromSnapshot(Snapshot) - Static method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- fromSnapshot(Snapshot, String) - Static method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- fromSpec(Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a
CloudObject
by copying the supplied serialized object spec, which must represent an SDK object serialized for transport via the Dataflow API. - fromSpec(HCatalogIO.Read) - Static method in class org.apache.beam.sdk.io.hcatalog.HCatToRow
- fromStandardParameters(ValueProvider<ResourceId>, String, String, boolean) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
Construct a
DefaultFilenamePolicy
. - fromStaticMethods(Class<?>, Class<?>) - Static method in class org.apache.beam.sdk.coders.CoderProviders
-
Creates a
CoderProvider
from a class's static <T> Coder<T> of(TypeDescriptor<T>, List<Coder<?>>) method. - fromString(String, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
- fromString(ValueProvider<String>, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
- fromSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Reads from the given subscription.
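For example, given an existing Pipeline p; the project and subscription names are placeholders:

    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<String> messages = p.apply(
        PubsubIO.readStrings()
            .fromSubscription("projects/my-project/subscriptions/my-subscription"));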
- fromSubscription(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Like
subscription()
but with a ValueProvider
. - fromSupplier(SerializableSupplier<Matcher<T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
Constructs a
SerializableMatcher
from a non-serializable Matcher
via indirection through SerializableSupplier
. - fromTable(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
A table name to be read in Snowflake.
- fromTable(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
- fromTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a BigQuery
TableSchema
to a Beam Schema
. - fromTableSchema(TableSchema, BigQueryUtils.SchemaConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a BigQuery
TableSchema
to a Beam Schema
. - fromTimestamp(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- fromTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Creates and returns a transform for reading from a Cloud Pub/Sub topic.
- fromTopic(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Like
PubsubIO.Read.fromTopic(String)
but with a ValueProvider
. - fromUri(String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Creates a GcsPath from a URI in string form.
- fromUri(URI) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Creates a GcsPath from a URI.
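A short sketch of both factory methods; bucket and object names are illustrative:

    import org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath;

    GcsPath a = GcsPath.fromUri("gs://my-bucket/path/to/object");
    GcsPath b = GcsPath.fromComponents("my-bucket", "path/to/object");
    // Both yield bucket "my-bucket" and object "path/to/object".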
- FULL - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
- FULL_RANGE - Static variable in class org.apache.beam.sdk.io.azure.cosmos.NormalizedRange
- FullNameTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider
-
Base class for table providers that look up table metadata using full table names, instead of querying it by parts of the name separately.
- FullNameTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.FullNameTableProvider
- fullOuterJoin(String, PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
-
Full Outer Join of two collections of KV elements.
- fullOuterJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
-
Full Outer Join of two collections of KV elements.
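A minimal sketch of the join-library form, assuming left is an existing PCollection<KV<String, Integer>> and right a PCollection<KV<String, String>>; the last two arguments are caller-chosen placeholders for keys missing on one side:

    import org.apache.beam.sdk.extensions.joinlibrary.Join;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Keys present on only one side are paired with -1 or "n/a".
    PCollection<KV<String, KV<Integer, String>>> joined =
        Join.fullOuterJoin(left, right, -1, "n/a");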
- fullOuterJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
-
Perform a full outer join.
- fullUpdate(String, String) - Static method in class org.apache.beam.sdk.io.mongodb.UpdateField
-
Creates an UpdateField describing a full update of the given destination field.
- fullyExpand(Map<TupleTag<?>, PValue>) - Static method in class org.apache.beam.sdk.values.PValues
-
Returns all the tagged
PCollections
represented in the given PValue
. - fun1(ScalaInterop.Fun1<T, V>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
- fun2(ScalaInterop.Fun2<T1, T2, V>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
- FUNCTION - Enum constant in enum class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout.Kind
- functionGroup - Variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ZetaSqlScalarFunctionImpl
-
ZetaSQL function group identifier.
- functionToFlatMapFunction(Function<InputT, OutputT>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
- fuse(PipelineTranslator.UnresolvedTranslation<T, T2>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.UnresolvedTranslation
G
- gauge(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can have its new value set, and is aggregated by taking the last reported value.
- gauge(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can have its new value set, and is aggregated by taking the last reported value.
- gauge(MetricName) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can have its new value set, and is aggregated by taking the last reported value.
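For example, inside a DoFn; the metric name and reported value are illustrative:

    import org.apache.beam.sdk.metrics.Gauge;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    class TrackLatestFn extends DoFn<Long, Long> {
      private final Gauge latest = Metrics.gauge(TrackLatestFn.class, "latest-value");

      @ProcessElement
      public void process(@Element Long element, OutputReceiver<Long> out) {
        latest.set(element); // only the last reported value is kept
        out.output(element);
      }
    }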
- Gauge - Interface in org.apache.beam.sdk.metrics
-
A metric that reports the latest value out of reported values.
- GaugeImpl - Class in org.apache.beam.runners.jet.metrics
-
Implementation of
Gauge
. - GaugeImpl(MetricName) - Constructor for class org.apache.beam.runners.jet.metrics.GaugeImpl
- GaugeResult - Class in org.apache.beam.sdk.metrics
-
The result of a
Gauge
metric. - GaugeResult() - Constructor for class org.apache.beam.sdk.metrics.GaugeResult
- GaugeResult.EmptyGaugeResult - Class in org.apache.beam.sdk.metrics
-
Empty
GaugeResult
, representing no values reported. - GceMetadataUtil - Class in org.apache.beam.sdk.extensions.gcp.util
- GceMetadataUtil() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GceMetadataUtil
- GcpCredentialFactory - Class in org.apache.beam.sdk.extensions.gcp.auth
-
Constructs an OAuth credential to be used by the SDK and the SDK workers.
- GcpIoPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.gcp.common
-
A registrar containing the default GCP options.
- GcpIoPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
- GcpOAuthScopesFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpOAuthScopesFactory
- GcpOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
-
Options used to configure Google Cloud Platform specific options such as the project and credentials.
- GcpOptions.DefaultProjectFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Attempts to infer the default project based upon the environment this application is executing within.
- GcpOptions.EnableStreamingEngineFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
EnableStreamingEngine defaults to false unless one of the two experiments is set.
- GcpOptions.GcpOAuthScopesFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Returns the default set of OAuth scopes.
- GcpOptions.GcpTempLocationFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Returns
PipelineOptions.getTempLocation()
as the default GCP temp location. - GcpOptions.GcpUserCredentialsFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Attempts to load the GCP credentials.
- GcpPipelineOptionsRegistrar - Class in org.apache.beam.sdk.extensions.gcp.options
-
A registrar containing the default GCP options.
- GcpPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpPipelineOptionsRegistrar
- GCPSecretSessionServiceFactory - Class in org.apache.beam.sdk.io.solace.broker
-
This class implements a
SessionServiceFactory
that retrieves the basic authentication credentials from a Google Cloud Secret Manager secret. - GCPSecretSessionServiceFactory() - Constructor for class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
- GCPSecretSessionServiceFactory.Builder - Class in org.apache.beam.sdk.io.solace.broker
- GcpTempLocationFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpTempLocationFactory
- GcpUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpUserCredentialsFactory
- GCS_URI - Static variable in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Pattern that is used to parse a GCS URL.
- GcsCountersOptions() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
- GcsCreateOptions - Class in org.apache.beam.sdk.extensions.gcp.storage
-
An abstract class that contains common configuration options for creating resources.
- GcsCreateOptions() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
- GcsCreateOptions.Builder - Class in org.apache.beam.sdk.extensions.gcp.storage
-
A builder for
GcsCreateOptions
. - GcsCustomAuditEntries() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.GcsCustomAuditEntries
- GcsFileSystemRegistrar - Class in org.apache.beam.sdk.extensions.gcp.storage
-
AutoService
registrar for the GcsFileSystem
. - GcsFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsFileSystemRegistrar
- GcsOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
-
Options used to configure Google Cloud Storage.
- GcsOptions.ExecutorServiceFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Returns the default
ExecutorService
to use within the Apache Beam SDK. - GcsOptions.GcsCustomAuditEntries - Class in org.apache.beam.sdk.extensions.gcp.options
-
Creates a
GcsOptions.GcsCustomAuditEntries
that holds key-value pairs to be stored as custom information in GCS audit logs. - GcsOptions.PathValidatorFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Creates a
PathValidator
object using the class specified in GcsOptions.getPathValidatorClass()
. - GcsPath - Class in org.apache.beam.sdk.extensions.gcp.util.gcsfs
-
Implements the Java NIO
Path
API for Google Cloud Storage paths. - GcsPath(FileSystem, String, String) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Constructs a GcsPath.
- GcsPathValidator - Class in org.apache.beam.sdk.extensions.gcp.storage
-
GCP implementation of
PathValidator
. - GcsReadOptionsFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsReadOptionsFactory
- GcsResourceId - Class in org.apache.beam.sdk.extensions.gcp.storage
-
ResourceId
implementation for Google Cloud Storage. - GcsStager - Class in org.apache.beam.runners.dataflow.util
-
Utility class for staging files to GCS.
- gcsUploadBufferSizeBytes() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
-
The buffer size (in bytes) to use when uploading files to GCS.
- GcsUtil - Class in org.apache.beam.sdk.extensions.gcp.util
-
Provides operations on GCS.
- GcsUtil.CreateOptions - Class in org.apache.beam.sdk.extensions.gcp.util
- GcsUtil.CreateOptions.Builder - Class in org.apache.beam.sdk.extensions.gcp.util
- GcsUtil.GcsCountersOptions - Class in org.apache.beam.sdk.extensions.gcp.util
- GcsUtil.GcsReadOptionsFactory - Class in org.apache.beam.sdk.extensions.gcp.util
- GcsUtil.GcsUtilFactory - Class in org.apache.beam.sdk.extensions.gcp.util
-
This is a
DefaultValueFactory
able to create a GcsUtil
using any transport flags specified on the PipelineOptions
. - GcsUtil.StorageObjectOrIOException - Class in org.apache.beam.sdk.extensions.gcp.util
-
A class that holds either a
StorageObject
or an IOException
. - GcsUtilFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsUtilFactory
- generate(Schema) - Static method in class org.apache.beam.sdk.coders.RowCoderGenerator
- generateInitialChangeStreamPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.ChangeStreamDao
-
Returns the result from the GenerateInitialChangeStreamPartitions API.
- generateInitialPartitionsAction(ChangeStreamMetrics, ChangeStreamDao, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class for processing
DetectNewPartitionsDoFn.
- GenerateInitialPartitionsAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
-
Class to generate the first set of outputs for
DetectNewPartitionsDoFn
. - GenerateInitialPartitionsAction(ChangeStreamMetrics, ChangeStreamDao, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.GenerateInitialPartitionsAction
- generateRandom(String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
-
Generates a unique name for the partition metadata table and its indexes.
- generateRowKeyPrefix() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.UniqueIdGenerator
-
Returns a random base64-encoded 8-byte string.
- GenerateSequence - Class in org.apache.beam.sdk.io
-
A
PTransform
that produces longs starting from the given value, and either up to the given limit or until Long.MAX_VALUE
, or until the given time elapses. - GenerateSequence() - Constructor for class org.apache.beam.sdk.io.GenerateSequence
- GenerateSequence.External - Class in org.apache.beam.sdk.io
-
Exposes GenerateSequence as an external transform for cross-language usage.
- GenerateSequence.External.ExternalConfiguration - Class in org.apache.beam.sdk.io
-
Parameters class to expose the transform to an external SDK.
- GenerateSequenceConfiguration() - Constructor for class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
- GenerateSequenceSchemaTransformProvider - Class in org.apache.beam.sdk.providers
- GenerateSequenceSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
- GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration - Class in org.apache.beam.sdk.providers
- GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder - Class in org.apache.beam.sdk.providers
- GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate - Class in org.apache.beam.sdk.providers
- GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate.Builder - Class in org.apache.beam.sdk.providers
- GenerateSequenceSchemaTransformProvider.GenerateSequenceSchemaTransform - Class in org.apache.beam.sdk.providers
- GenerateSequenceTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.seqgen
-
Sequence generator table provider.
- GenerateSequenceTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.seqgen.GenerateSequenceTableProvider
- generic() - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
-
Returns an
AvroDatumFactory
instance for GenericRecord. - generic(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an
AvroCoder
instance for the Avro schema. - GenericDatumFactory() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.GenericDatumFactory
- GenericDlq - Class in org.apache.beam.sdk.schemas.io
-
Helper to generate a DLQ transform to write PCollection
to an external system. - GenericDlqProvider - Interface in org.apache.beam.sdk.schemas.io
-
A Provider for generic DLQ transforms that handle deserialization failures.
- get() - Method in class org.apache.beam.runners.portability.CloseableResource
-
Gets the underlying resource.
- get() - Static method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
- get() - Method in class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
- get() - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsBuilderFactory
- get() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
-
Returns the estimated throughput bytes for this run.
- get() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
-
Returns the estimated throughput bytes for now.
- get() - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.ThroughputEstimator
-
Returns the estimated throughput for now.
- get() - Method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
- get() - Method in class org.apache.beam.sdk.io.kafka.KafkaIOUtils.MovingAvg
- get() - Method in interface org.apache.beam.sdk.options.ValueProvider
-
Returns the runtime value wrapped by this
ValueProvider
in case it is ValueProvider.isAccessible()
, otherwise fails. - get() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
- get() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
- get() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
- get() - Method in interface org.apache.beam.sdk.transforms.Materializations.IterableView
-
Returns an iterable for all values.
- get() - Method in interface org.apache.beam.sdk.transforms.Materializations.MultimapView
-
Returns an iterable of all keys.
- get() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
- get(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedList
- get(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedRow
- get(int) - Method in class org.apache.beam.sdk.values.PCollectionList
-
Returns the
PCollection
at the given index (origin zero). - get(int) - Method in class org.apache.beam.sdk.values.TupleTagList
-
Returns the
TupleTag
at the given index (origin zero). - get(K) - Method in interface org.apache.beam.sdk.transforms.Materializations.MultimapView
-
Returns an iterable of all the values for the specified key.
- get(Long) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
Returns the
Broadcast
containing theGlobalWatermarkHolder.SparkWatermarks
mapped to their sources. - get(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedMap
- get(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- get(String) - Method in interface org.apache.beam.sdk.state.TimerMap
- get(String) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
- get(String) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
Returns the
PCollection
associated with the given tag in thisPCollectionTuple
. - get(K) - Method in interface org.apache.beam.sdk.state.MapState
-
A deferred lookup, using null values if the item is not found.
- get(K) - Method in interface org.apache.beam.sdk.state.MultimapState
-
A deferred lookup, returns an empty iterable if the item is not found.
- get(K, W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
-
Returns an
Iterable
of values representing the bag user state for the given key and window. - get(K, W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.MultimapSideInputHandler
-
Returns an
Iterable
of values representing the side input for the given key and window. - get(ObjectT) - Method in interface org.apache.beam.sdk.schemas.FieldValueGetter
- get(JobInfo) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageContextFactory
- get(JobInfo) - Method in interface org.apache.beam.runners.fnexecution.control.ExecutableStageContext.Factory
-
Get or create
ExecutableStageContext
for givenJobInfo
. - get(JobInfo) - Method in class org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory
- get(JobInfo) - Method in class org.apache.beam.runners.spark.translation.SparkExecutableStageContextFactory
- get(BoundedWindow) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues.ByWindow
- get(BoundedWindow) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues
- get(BoundedWindow) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues.Global
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkSideInputReader
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.CachedSideInputReader
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.util.CachedSideInputReader
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.twister2.utils.Twister2SideInputReader
- get(PValue) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Retrieve an object of Type T associated with the PValue passed in.
- get(TupleTag<T>) - Method in interface org.apache.beam.sdk.transforms.DoFn.MultiOutputReceiver
-
Returns an
DoFn.OutputReceiver
for the given tag. - get(TupleTag<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
- get(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.CombineFns.CoCombineResult
-
Returns the value represented by the given
TupleTag
. - get(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema.AbstractGetterTypeSupplier
- get(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
- get(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.SetterTypeSupplier
- get(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema.JavaFieldTypeSupplier
- get(TypeDescriptor<?>) - Method in interface org.apache.beam.sdk.schemas.utils.FieldValueTypeSupplier
-
Return all the FieldValueTypeInformations.
- get(TypeDescriptor<?>, Schema) - Method in interface org.apache.beam.sdk.schemas.utils.FieldValueTypeSupplier
-
Return all the FieldValueTypeInformations.
- get(W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.IterableSideInputHandler
-
Returns an
Iterable
of values representing the side input for the given window. - get(W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.MultimapSideInputHandler
-
Returns an
Iterable
of keys representing the side input for the given window. - getAcceptedIssuers() - Method in class org.apache.beam.sdk.io.splunk.CustomX509TrustManager
- getAccessKey() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getAccountName() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getAccum() - Method in interface org.apache.beam.sdk.state.CombiningState
-
Read the merged accumulator for this state cell.
- getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
-
Returns the
TypeVariable
ofAccumT
. - getAccumulatorCoder(CoderRegistry, Coder<TimestampedValue<KV<EventKeyT, KV<Long, EventT>>>>) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- getAccumulatorCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
- getAccumulatorCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
- getAccumulatorCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
- getAccumulatorCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
- getAccumulatorCoder(CoderRegistry, Coder<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- getAccumulatorCoder(CoderRegistry, Coder<byte[]>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
- getAccumulatorCoder(CoderRegistry, Coder<String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
-
Returns the
Coder
to use for accumulatorAccumT
values, or null if it is not able to be inferred. - getAccumulatorCoder(CoderRegistry, Coder<Boolean>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- getAccumulatorCoder(CoderRegistry, Coder<Double>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- getAccumulatorCoder(CoderRegistry, Coder<Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- getAccumulatorCoder(CoderRegistry, Coder<Long>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- getAccumulatorCoder(CoderRegistry, Coder<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
- getAccumulatorCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
- getActiveWorkRefreshPeriodMillis() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getAdditionalInputs() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
- getAdditionalInputs() - Method in class org.apache.beam.sdk.io.WriteFiles
- getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns the side inputs of this
Combine
, tagged with the tag of thePCollectionView
. - getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
Returns the side inputs of this
Combine
, tagged with the tag of thePCollectionView
. - getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
-
Returns the side inputs of this
ParDo
, tagged with the tag of thePCollectionView
. - getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
-
Returns the side inputs of this
ParDo
, tagged with the tag of thePCollectionView
. - getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Returns all
PValues
that are consumed as inputs to thisPTransform
that are independent of the expansion of thePTransform
withinPTransform.expand(PInput)
. - getAdditionalOutputTags() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
- getAddresses() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getAlgorithm() - Method in enum class org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
-
Returns the string representation of this type.
- getAlgorithm() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
- getAliases() - Static method in class org.apache.beam.sdk.managed.ManagedSchemaTransformProvider
- getAll() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Retrieve all HL7v2 Messages from a PCollection of message IDs (such as from PubSub notification subscription).
- getAll() - Method in class org.apache.beam.sdk.values.PCollectionList
-
Returns an immutable List of all the
PCollections
in thisPCollectionList
. - getAll() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Returns an immutable Map from tag to corresponding
PCollection
, for all the members of thisPCollectionRowTuple
. - getAll() - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
Returns an immutable Map from
TupleTag
to correspondingPCollection
, for all the members of thisPCollectionTuple
. - getAll() - Method in class org.apache.beam.sdk.values.TupleTagList
-
Returns an immutable List of all the
TupleTags
in thisTupleTagList
. - getAll(String) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Like
CoGbkResult.getAll(TupleTag)
but using a String instead of aTupleTag
. - getAll(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Returns the values from the table represented by the given
TupleTag<V>
as anIterable<V>
(which may be empty if there are no results). - getAllFields() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
If true, all fields are being accessed.
- getAllIds(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getAllJobs() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- getAllMetadata() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Deprecated.use schema options instead.
- getAllowDuplicates() - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeArbitrarily
- getAllowDuplicates() - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeByKey
- getAllowedLateness() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getAllowedTimestampSkew() - Method in class org.apache.beam.sdk.transforms.DoFn
-
Deprecated.This method permits a
DoFn
to emit elements behind the watermark. These elements are considered late, and if behind theallowed lateness
of a downstreamPCollection
may be silently dropped. See https://github.com/apache/beam/issues/18065 for details on a replacement. - getAllowedTimestampSkew() - Method in class org.apache.beam.sdk.transforms.WithTimestamps
-
Deprecated.This method permits a to elements to be emitted behind the watermark. These elements are considered late, and if behind the
allowed lateness
of a downstreamPCollection
may be silently dropped. See https://github.com/apache/beam/issues/18065 for details on a replacement. - getAllowlist() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
- getAllowNonRestoredState() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getAllPartitionsCreatedAfter(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Fetches all partitions with a
PartitionMetadataAdminDao.COLUMN_CREATED_AT
less than the given timestamp. - getAllRows(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getAllWorkerStatuses(long, TimeUnit) - Method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
-
Get all the statuses from all connected SDK harnesses within specified timeout.
- getAlpha() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPFieldRef
- getAlsoStartLoopbackWorker() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- getAndMaybeCreateSplitOutput(ReaderOutput<OutputT>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase.ReaderAndOutput
- getAnnotatedConstructor(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- getAnnotatedCreateMethod(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- getAnnotations() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Returns annotations map to provide additional hints to the runner.
- getApiKey() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getApiPrefix() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
Generates the API endpoint prefix based on the set values.
- getApiRootUrl() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The root URL for the Dataflow API.
- getApiServiceDescriptor() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Get an
Endpoints.ApiServiceDescriptor
describing the endpoint thisGrpcFnServer
is bound to. - getAppend() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
- getAppId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getApplicationName() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- getAppliedFn(CoderRegistry, Coder<? extends KV<K, ? extends Iterable<InputT>>>, WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
-
Returns the
Combine.CombineFn
bound to its coders. - getApplyMethod(ScalarFn) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFnReflector
-
Gets the method annotated with
ScalarFn.ApplyMethod
fromscalarFn
. - getAppName() - Method in interface org.apache.beam.sdk.options.ApplicationNameOptions
-
Name of application, for display purposes.
- getAppProfileId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
-
Returns the app profile being read from.
- getApproximateArrivalTimestamp() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
- getArgument() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
-
An optional argument to configure the type.
- getArguments() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.LocalMktDate
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCDateOnly
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
- getArgumentType() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
-
A schema type representing how to interpret the argument.
- getArgumentTypes(Method) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a list of argument types for the given method, which must be a part of the class.
- getArity() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- getArity() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- getArray(int) - Method in class org.apache.beam.sdk.values.Row
-
Get an array value by field index,
IllegalStateException
is thrown if schema doesn't match. - getArray(String) - Method in class org.apache.beam.sdk.values.Row
-
Get an array value by field name,
IllegalStateException
is thrown if schema doesn't match. - getArtifact(ArtifactApi.GetArtifactRequest, StreamObserver<ArtifactApi.GetArtifactResponse>) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- getArtifact(RunnerApi.ArtifactInformation) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- getArtifactPort() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- getArtifactStagingPath() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- getATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- getATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- getAttachedMode() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getAttachmentBytes() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the attachment data of the message as a byte array, if any.
- getAttempted() - Method in class org.apache.beam.sdk.metrics.MetricResult
-
Return the value of this metric across all attempts of executing all parts of the pipeline.
- getAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the given attribute value.
- getAttributeId() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getAttributeId() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getAttributeMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the full map of attributes.
- getAttributeMap() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getAttributesMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getAttributesMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- getAuthenticator() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getAuthenticator() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getAuthToken(String, String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
-
Certain embedded scenarios and so on actually allow for having no authentication at all.
- getAutoCommit() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getAutoOffsetResetConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getAutoscalingAlgorithm() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
The autoscaling algorithm to use for the workerpool.
- getAutosharding() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getAutoSharding() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getAutoWatermarkInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getAvroBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Returns a function mapping encoded AVRO
GenericRecord
s to BeamRow
s. - getAvroFilterFormatFunction(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- getAwsCredentialsProvider() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
AwsCredentialsProvider
used to configure AWS service clients. - getAwsRegion() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
Region used to configure AWS service clients.
- getAzureConnectionString() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getAzureCredentialsProvider() - Method in interface org.apache.beam.sdk.io.azure.options.AzureOptions
-
The credential instance that should be used to authenticate against Azure services.
- getBacking() - Method in class org.apache.beam.sdk.fn.data.WeightedList
- getBacklogBytes(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClient
- getBacklogBytes(String) - Method in class org.apache.beam.sdk.io.solace.broker.SempBasicAuthClientExecutor
- getBacklogBytes(String) - Method in interface org.apache.beam.sdk.io.solace.broker.SempClient
-
Retrieves the size of the backlog (in bytes) for the specified queue.
- getBacklogCheckTime() - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
-
The time at which latest offset for the partition was fetched in order to calculate backlog.
- getBadRecordErrorHandler() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getBadRecordErrorHandler() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getBadRecordErrorHandler() - Method in class org.apache.beam.sdk.io.WriteFiles
- getBadRecordRouter() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getBadRecordRouter() - Method in class org.apache.beam.sdk.io.WriteFiles
- getBagUserStateSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
Get a mapping from PTransform id to user state input id to
bag user states
that are used during execution. - getBaseAutoValueClass(TypeDescriptor<?>) - Static method in class org.apache.beam.sdk.schemas.utils.AutoValueUtils
- getBaseName() - Method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.ParsedMetricName
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.LocalMktDate
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCDateOnly
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
- getBaseType() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
-
The base
Schema.FieldType
used to store values of this type. - getBaseValue(int) - Method in class org.apache.beam.sdk.values.Row
-
Returns the base type for this field.
- getBaseValue(int, Class<T>) - Method in class org.apache.beam.sdk.values.Row
-
Returns the base type for this field.
- getBaseValue(String) - Method in class org.apache.beam.sdk.values.Row
-
Returns the base type for this field.
- getBaseValue(String, Class<T>) - Method in class org.apache.beam.sdk.values.Row
-
Returns the base type for this field.
- getBaseValues() - Method in class org.apache.beam.sdk.values.Row
-
Return a list of data values.
- getBatchClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- getBatchCombinePerKeyOperator(FlinkStreamingTranslationContext, PCollection<KV<K, InputT>>, Map<Integer, PCollectionView<?>>, List<PCollectionView<?>>, Coder<WindowedValue<KV<K, AccumT>>>, CombineFnBase.GlobalCombineFn<InputT, AccumT, ?>, WindowDoFnOperator<K, AccumT, OutputT>, TypeInformation<WindowedValue<KV<K, OutputT>>>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
- getBatchDuration() - Method in class org.apache.beam.runners.spark.io.CreateStream
- getBatchDuration(SerializablePipelineOptions) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Retrieves the batch duration in milliseconds from Spark pipeline options.
- getBatches() - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Get the underlying queue representing the mock stream of micro-batches.
- getBatching() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getBatchingParams() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
-
Returns user supplied parameters for batching.
- getBatchingParams() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.WithShardedKey
-
Returns user supplied parameters for batching.
- getBatchInitialCount() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The initial size of a batch; used in the absence of the QoS system having significant data to determine a better batch size.
- getBatchIntervalMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- getBatchMaxBytes() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The maximum number of bytes to include in a batch.
- getBatchMaxCount() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The maximum number of writes to include in a batch.
- getBatchService() - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices
- getBatchService() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeServicesImpl
- getBatchSize() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getBatchSize() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
- getBatchSize() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- getBatchSizeBytes() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getBatchSizeBytes() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- getBatchSizeBytes() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getBatchSizeBytes() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- getBatchTargetLatency() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Target latency for batch requests.
- getBeamCheckpointDir() - Method in class org.apache.beam.runners.spark.translation.streaming.Checkpoint.CheckpointDir
- getBeamRelInput(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- getBeamSchemaFromProto(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
-
Retrieves a Beam Schema from a Protocol Buffer message.
- getBeamSchemaFromProtoSchema(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
-
Parses the given Protocol Buffers schema string, retrieves the Descriptor for the specified message name, and constructs a Beam Schema from it.
- getBeamSplitSource() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplit
- getBeamSqlTable() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- getBeamSqlUdfs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
-
For UDFs implement
BeamSqlUdf
. - getBearerToken() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getBigQueryEndpoint() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
BQ endpoint to use.
- getBigQueryLocation() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
- getBigQueryProject() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getBigtableChangeStreamInstanceId() - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.BigtableChangeStreamTestOptions
- getBigtableClientOverride() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
-
Returns the Bigtable client override.
- getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Deprecated.read options are configured directly on BigtableIO.read(). Use
BigtableIO.Read.populateDisplayData(DisplayData.Builder)
to view the current configurations. - getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Deprecated.write options are configured directly on BigtableIO.write(). Use
BigtableIO.Write.populateDisplayData(DisplayData.Builder)
to view the current configurations. - getBlobServiceEndpoint() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
-
The Azure Blobstore service endpoint used by the Blob service client.
- getBlobstoreClientFactoryClass() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getBlockOffset() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShard
-
Return the absolute position within the Ism file where the data block begins.
- getBloomFilterPosition() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- getBody() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
Message body.
- getBody() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getBoolean() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getBoolean(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Boolean
value by field index,ClassCastException
is thrown if schema doesn't match. - getBoolean(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.BOOLEAN
value by field name,IllegalStateException
is thrown if schema doesn't match. - getBoolean(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getBoolean(Map<String, Object>, String, Boolean) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getBootstrapServers() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- getBootstrapServers() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
-
Sets the bootstrap servers for the Kafka consumer.
- getBootstrapServers() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getBoundedness() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- getBoundednessOfRelNode(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
-
This method returns the Boundedness of a RelNode.
- getBoundedTrie(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- getBoundedTrie(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the
BoundedTrie
that should be used for implementing the givenmetricName
in this container. - getBoundedTries() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the bounded tries that matched the filter.
- getBqStreamingApiLoggingFrequencySec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getBranch() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getBroadcastSizeEstimate() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
- getBucket() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Returns the bucket name associated with this GCS path, or an empty string if this is a relative path component.
- getBucket(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Get the
Bucket
from Cloud Storage path or propagates an exception. - getBucketKeyEnabled() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getBucketKeyEnabled() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Whether to use an S3 Bucket Key for object encryption with server-side encryption using AWS KMS (SSE-KMS) or not.
- getBufferSize() - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
- getBuilderCreator(Class<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.AutoValueUtils
-
Try to find an accessible builder class for creating an AutoValue class.
- getBuiltinMethods() - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BeamBuiltinFunctionProvider
- getBulkDirective() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
- getBulkEndPoint() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getBulkIO() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- getBundle() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
-
FHIR R4 bundle resource object as a string.
- getBundle(OutputReceiverFactory, TimerReceiverFactory, StateRequestHandler, BundleProgressHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
-
Get a new
bundle
for processing the data in an executable stage. - getBundle(OutputReceiverFactory, TimerReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
- getBundle(OutputReceiverFactory, TimerReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler, BundleCheckpointHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
- getBundle(OutputReceiverFactory, StateRequestHandler, BundleProgressHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
-
Get a new
bundle
for processing the data in an executable stage. - getBundle(OutputReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
- getBundle(OutputReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler, BundleCheckpointHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
- getBundleFinalizer() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- getBundleProcessorCacheTimeout() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
- getBundleSize() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- getByte() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getByte(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.BYTE
value by field index,ClassCastException
is thrown if schema doesn't match. - getByte(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.BYTE
value by field name,IllegalStateException
is thrown if schema doesn't match. - getBytes() - Method in class org.apache.beam.sdk.io.range.ByteKey
-
Returns a newly-allocated
byte[]
representing thisByteKey
. - getBytes(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.BYTES
value by field index,ClassCastException
is thrown if schema doesn't match. - getBytes(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.BYTES
value by field name,IllegalStateException
is thrown if schema doesn't match. - getBytes(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getBytes(Map<String, Object>, String, byte[]) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getBytesPerOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns approximately how many bytes of data correspond to a single offset in this source.
- getBytesToRowFn(Schema) - Static method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider
- getCacheCandidates() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Get the map of cache candidates hold by the evaluation context.
- getCacheTokens() - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandler
-
Retrieves a list of valid cache tokens.
- getCalciteConnectionProperties() - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
- getCallable() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
- getCandidatesForGroupByKeyAndWindowTranslation() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Get the map of GBK transforms to their full names, which are candidates for group by key and window translation which aims to reduce memory usage.
- getCaseEnumType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Returns the
EnumerationType
that is used to represent the case type. - getCaseSensitive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getCaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
-
Returns the enumeration that specified which OneOf field is set.
- getCatalog() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getCatalog(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.CatalogManager
-
Attempts to fetch the catalog with this name.
- getCatalog(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.EmptyCatalogManager
- getCatalog(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogManager
- getCatalogConfig() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getCatalogManager() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getCatalogName() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- getCatalogName() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getCatalogProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- getCatalogProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getCatalogs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.CatalogRegistrar
- getCatalogs() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogRegistrar
- getCause() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
- getCellsMutatedPerColumn(String, String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
-
Return the total number of cells affected when the specified column is mutated.
- getCellsMutatedPerRow(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
-
Return the total number of cells affected with the given row is deleted.
- getCEPFieldRefFromParKeys(ImmutableBitSet) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
Transform the partition columns into serializable CEPFieldRef.
- getCepKind() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperator
- getCEPPatternFromPattern(Schema, RexNode, Map<String, RexNode>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
Construct a list of
CEPPattern
s from aRexNode
. - getChangeSequenceNumber() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
-
The value supplied to the BigQuery
_CHANGE_SEQUENCE_NUMBER
pseudo-column. - getChangeStreamContinuationTokens() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- getChangeStreamContinuationTokens() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- getChangeStreamDao() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- getChangeStreamDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Creates and returns a singleton DAO instance for querying a partition change stream.
- getChangeStreamName() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- getChangeStreamName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- getChangeStreamNamePrefix() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
-
Return the prefix used to identify the rows belonging to this job.
- getChangeStreamNamePrefix() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
- getChannelFactory() - Method in class org.apache.beam.sdk.io.CompressedSource
- getChannelNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- getChannelzShowOnlyWindmillServiceChannels() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getCharset() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration
- getCheckpointDir() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- getCheckpointDurationMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- getCheckpointingInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getCheckpointingInterval() - Method in interface org.apache.beam.sdk.io.kafka.KafkaIO.Read.FakeFlinkPipelineOptions
- getCheckpointingMode() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getCheckpointMark() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplit
- getCheckpointMark() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
- getCheckpointMark() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getCheckpointMark() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns a
UnboundedSource.CheckpointMark
representing the progress of thisUnboundedReader
. - getCheckpointMarkCoder() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
- getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
- getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.UnboundedSource
-
Returns a
Coder
for encoding and decoding the checkpoints for this source. - getCheckpointTimeoutMillis() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getCheckStopReadingFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getChildPartitions() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
List of child partitions yielded within this record.
- getChildRels(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- getClass(String, String) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler
- getClasses() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a set of
TypeDescriptor
s, one for each superclass (including this class). - getClassName() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Gets the name of the Java class that this CloudObject represents.
- getClazz() - Method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
- getClient(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- getClientBuilderFactory() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
- getClientFactory() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getClientInfo() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getClientInfo(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getClock() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getCloningBehavior() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated.Use
TestPipeline
with theDirectRunner
. - getCloseStream() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- getClosingBehavior() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getClosure() - Method in class org.apache.beam.sdk.transforms.Contextful
-
Returns the closure.
- getCluster() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter
- getClusterId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getClusteringFields() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getClusterName() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- getClusterType() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
-
Returns the type code of the column.
- getCodec(DestinationT) - Method in class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
-
Return an AVRO codec for a given destination.
- getCodeJarPathname() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- getCoder() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- getCoder() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- getCoder() - Method in class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
- getCoder() - Method in class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
- getCoder() - Method in class org.apache.beam.sdk.coders.DelegateCoder
-
Returns the coder used to encode/decode the intermediate values produced/consumed by the coding functions of this DelegateCoder.
- getCoder() - Method in class org.apache.beam.sdk.fn.data.DataEndpoint
- getCoder() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
- getCoder() - Method in interface org.apache.beam.sdk.io.singlestore.SingleStoreIO.RowMapperWithCoder
- getCoder() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record
-
The coder for the record, or null if there is no coder.
- getCoder() - Static method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns a Coder suitable for IntervalWindow.
- getCoder() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the Coder used by this PCollection to encode and decode the values stored in it.
- getCoder(Class<? extends T>, Class<T>, Map<Type, ? extends Coder<?>>, TypeVariable<?>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
Deprecated. This method is subject to change in an unknown backwards-incompatible way once support for this functionality is refined.
- getCoder(Class<T>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
Returns the Coder to use for values of the given class.
- getCoder(CoderRegistry) - Method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
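The CoderRegistry lookups indexed here resolve a Coder from a class or TypeDescriptor and throw CannotProvideCoderException when inference fails. A brief sketch:

    CoderRegistry registry = CoderRegistry.createDefault();
    try {
      Coder<Integer> intCoder = registry.getCoder(Integer.class);
      Coder<KV<String, Long>> kvCoder =
          registry.getCoder(new TypeDescriptor<KV<String, Long>>() {});
    } catch (CannotProvideCoderException e) {
      // No coder could be inferred for the requested type.
    }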
- getCoder(CoderRegistry) - Method in interface org.apache.beam.sdk.io.kafka.DeserializerProvider
- getCoder(Pipeline) - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
- getCoder(TypeDescriptor<OutputT>, TypeDescriptor<InputT>, Coder<InputT>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
Deprecated. This method is subject to change in an unknown backwards-incompatible way once support for this functionality is refined.
- getCoder(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
Returns the Coder to use for values of the given type.
- getCoderArguments() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- getCoderArguments() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.MetadataKeyCoder
- getCoderArguments() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.AtomicCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.Coder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.CustomCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.KvCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.MapCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.NullableCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.OptionalCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.SnappyCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.VarLongCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.ZstdCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.io.ReadableFileCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
- getCoderArguments() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
- getCoderArguments() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.values.WindowedValues.ValueOnlyWindowedValueCoder
-
Deprecated.
- getCoderInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
Deprecated. This method will be removed entirely. The PCollection underlying a side input, including its Coder, is part of the side input's specification with a ParDo transform, which will obtain that information via a package-private channel.
- getCoderInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
- getCoderProvider() - Static method in class org.apache.beam.sdk.coders.SerializableCoder
-
Returns a CoderProvider which uses the SerializableCoder if possible for all types.
- getCoderProvider() - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns a CoderProvider which uses the AvroCoder if possible for all types.
- getCoderProvider() - Static method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
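These static getCoderProvider() methods expose a provider that a CoderRegistry can consult as a fallback. A hedged sketch of wiring one into a pipeline's registry:

    Pipeline pipeline = Pipeline.create();
    // Let the registry fall back to SerializableCoder when nothing more specific matches.
    pipeline.getCoderRegistry().registerCoderProvider(SerializableCoder.getCoderProvider());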
- getCoderProvider() - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.hadoop.WritableCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
- getCoderProviders() - Method in interface org.apache.beam.sdk.coders.CoderProviderRegistrar
-
Returns a list of coder providers which will be registered by default within each coder registry instance.
- getCoderProviders() - Method in class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.coders.SerializableCoder.SerializableCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtobufCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder.WritableCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.io.hbase.HBaseCoderProviderRegistrar
- getCoderRegistry() - Method in class org.apache.beam.sdk.Pipeline
-
Returns the CoderRegistry that this Pipeline uses.
- getCoderTranslators() - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderRegistrar
- getCoderURNs() - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderRegistrar
- getCoGbkResultSchema() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Returns the CoGbkResultSchema associated with this KeyedPCollectionTuple.
- getCohorts() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
-
Returns a list of sets of expressions that should be on the same level.
- getCollations() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- getCollection() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
-
Returns the underlying PCollection of this TaggedKeyedPCollection.
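getCoGbkResultSchema() and getCollection() belong to the CoGroupByKey machinery, where keyed inputs are bundled into a KeyedPCollectionTuple. A minimal sketch, assuming two inputs of type PCollection<KV<String, String>> named emails and orders:

    final TupleTag<String> emailTag = new TupleTag<>();
    final TupleTag<String> orderTag = new TupleTag<>();
    PCollection<KV<String, CoGbkResult>> joined =
        KeyedPCollectionTuple.of(emailTag, emails)
            .and(orderTag, orders)
            .apply(CoGroupByKey.create());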
- getCollectionElementType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- getColumnDelimiter() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getColumnDelimiter() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- getColumnDelimiter() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getColumns() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getColumns() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getColumns() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
- getColumns(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
- getCombineFn() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
- getCombineFn() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
- getComment() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
- getCommitDeadline() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getCommitRetrySettings() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getCommitted() - Method in class org.apache.beam.sdk.metrics.MetricResult
-
Return the value of this metric across all successfully completed parts of the pipeline.
- getCommittedOrNull() - Method in class org.apache.beam.sdk.metrics.MetricResult
-
Return the value of this metric across all successfully completed parts of the pipeline, or null if the runner does not support committed metrics.
- getCommitTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
-
Returns the commit timestamp of the read / write transaction.
- getCommitTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The timestamp at which the modifications within were committed in Cloud Spanner.
- getCommitTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEventRecord
-
Returns the timestamp at which the key range change occurred.
- getComponents() - Method in class org.apache.beam.sdk.coders.AtomicCoder
- getComponents() - Method in class org.apache.beam.sdk.coders.StructuredCoder
- getComponents() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
- getComponents() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
-
Hierarchy list of component paths making up the full path, starting with the top-level child component path.
- getComponents() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- getComponents() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
- getComponents() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
- getComponents() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
- getComponents() - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- getComponents(AvroGenericCoder) - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderTranslator
- getComponentType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns the component type if this type is an array type, otherwise returns null.
- getCompression() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
-
Returns the method with which this file will be decompressed in FileIO.ReadableFile.open().
- getCompression() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
-
See Compression for expected values.
- getCompression() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
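FileIO.ReadableFile.getCompression() reports how a matched file will be decompressed when opened. A short sketch of obtaining ReadableFiles (the file pattern is illustrative):

    PCollection<FileIO.ReadableFile> files =
        pipeline
            .apply(FileIO.match().filepattern("gs://my-bucket/logs/*.gz")) // illustrative path
            .apply(FileIO.readMatches());
    // In a downstream DoFn, file.getCompression() would report e.g. Compression.GZIP.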
- getCompression() - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration
- getCompression() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getCompressionCodecName() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration
- getComputeNumShards() - Method in class org.apache.beam.sdk.io.WriteFiles
- getConfig() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter
- getConfig() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceProviderFromDataSourceConfiguration
- getConfigProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- getConfigProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getConfigUpdates() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- getConfiguration() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
- getConfiguration() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
- getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
- getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
- getConfigurationMap() - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Returns the configuration map.
- getConfigurationRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform
- getConfigurationRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteSchemaTransformProvider.BigQueryWriteSchemaTransform
- getConfiguredLoggerFromOptions(SdkHarnessOptions) - Static method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Configures the log manager's default log level and log level overrides from the SDK harness options, and returns the list of configured loggers.
- getConfluentSchemaRegistrySubject() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getConfluentSchemaRegistryUrl() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getConnection(InfluxDbIO.DataSourceConfiguration, boolean) - Static method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO
- getConnectionInitSql() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getConnectionInitSql() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getConnectionProperties() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getConnectionProperties() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getConnector() - Method in enum class org.apache.beam.io.debezium.Connectors
-
Returns the Debezium connector class.
- getConnectStringPrefix() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
- getConnectTimeout() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getConstructorCreator(TypeDescriptor<?>, Constructor, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- getConstructorCreator(TypeDescriptor<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.AutoValueUtils
-
Tries to find an accessible constructor for creating an AutoValue class.
- getConstructorCreator(TypeDescriptor<T>, Constructor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- getConsumerConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getConsumerConfigUpdates() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getConsumerFactoryFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getConsumerFactoryFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getConsumerPollingTimeout() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
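The KafkaIO.Read getters indexed above correspond to the transform's builder-style configuration. A hedged configuration sketch (broker address, topic, and group id are illustrative):

    PCollection<KafkaRecord<Long, String>> records =
        pipeline.apply(
            KafkaIO.<Long, String>read()
                .withBootstrapServers("broker-1:9092")
                .withTopic("events")
                .withKeyDeserializer(LongDeserializer.class)
                .withValueDeserializer(StringDeserializer.class)
                .withConsumerConfigUpdates(ImmutableMap.of("group.id", "my-group")));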
- getContainerImageBaseRepository() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the version/tag for constructing the container image path.
- getContent() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Returns the extracted text.
- getContentEncoding() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getContentType() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
-
The content type for the created file, e.g. "text/plain".
- getContentType() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getContext() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
- getContext() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets the context of a plugin.
- getContiguousSequenceRangeReevaluationFrequency() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler.OrderedProcessingGlobalSequenceHandler
-
How frequently the combiner should reevaluate the maximum range. This parameter only affects the behavior of streaming pipelines.
- getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
- getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
Return a trigger to use after a GroupByKey to preserve the intention of this trigger.
- getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger.OnceTrigger
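Continuation triggers only come into play after a GroupByKey; user code normally specifies just the original trigger, as in this hedged windowing sketch over an assumed PCollection<String> input:

    PCollection<String> windowed =
        input.apply(
            Window.<String>into(FixedWindows.of(Duration.standardMinutes(10)))
                .triggering(
                    AfterWatermark.pastEndOfWindow()
                        .withEarlyFirings(
                            AfterProcessingTime.pastFirstElementInPane()
                                .plusDelayOf(Duration.standardMinutes(1))))
                .withAllowedLateness(Duration.standardMinutes(30))
                .discardingFiredPanes());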
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Never.NeverTrigger
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
Subclasses should override this to return the Trigger.getContinuationTrigger() of this Trigger.
- getConversionOptions(ObjectNode) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
- getConvertedSchemaInformation(Schema, TypeDescriptor<T>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroSchemaInformationProvider
- getConvertedSchemaInformation(Schema, TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.utils.RowSchemaInformationProvider
- getConvertedSchemaInformation(Schema, TypeDescriptor<T>) - Method in interface org.apache.beam.sdk.schemas.utils.SchemaInformationProvider
- getConvertedSchemaInformation(Schema, TypeDescriptor<T>, SchemaRegistry) - Static method in class org.apache.beam.sdk.schemas.utils.ConvertHelpers
-
Get the coder used for converting from an inputSchema to a given type.
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.DateConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeMicrosConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampConversion
- getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampMicrosConversion
- getConvertPrimitive(Schema.FieldType, TypeDescriptor<?>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.ConvertHelpers
-
Returns a function to convert a Row into a primitive type.
- getCorrelationId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getCosmosClientBuilder() - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
- getCosmosKey() - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
-
The Azure Cosmos key used to perform authentication when accessing resources.
- getCosmosServiceEndpoint() - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
-
The Azure Cosmos service endpoint used by the Cosmos client.
- getCount() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- getCount() - Method in class org.apache.beam.sdk.metrics.DistributionResult
- getCount() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
- getCountBackoffs() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count invocations of BackOff.nextBackOffMillis().
- getCountCacheReadFailures() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count Cache read failures.
- getCountCacheReadNonNulls() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count associated non-null values resulting from Cache reads.
- getCountCacheReadNulls() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count associated null values resulting from Cache reads.
- getCountCacheReadRequests() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count number of attempts to read from the Cache.
- getCountCacheWriteFailures() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count Cache write failures.
- getCountCacheWriteRequests() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count number of attempts to write to the Cache.
- getCountCacheWriteSuccesses() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count Cache write successes.
- getCountCalls() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count invocations of Caller.call(RequestT).
- getCountEndPoint() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getCounter(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- getCounter(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the Counter that should be used for implementing the given metricName in this container.
- getCounters() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the counters that matched the filter.
- getCounters(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
- getCounters(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
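MetricsContainer.getCounter(MetricName) is runner-facing; in user code the same counters are created through Metrics and read back via MetricQueryResults.getCounters(). A hedged sketch (namespace and counter name are illustrative, and "result" is an assumed PipelineResult):

    static class ParseFn extends DoFn<String, String> {
      private final Counter malformed = Metrics.counter("parse", "malformed-records");

      @ProcessElement
      public void process(@Element String line, OutputReceiver<String> out) {
        if (line.isEmpty()) {
          malformed.inc(); // bump the counter for bad input
          return;
        }
        out.output(line);
      }
    }

    MetricQueryResults results =
        result.metrics().queryMetrics(
            MetricsFilter.builder()
                .addNameFilter(MetricNameFilter.named("parse", "malformed-records"))
                .build());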
- getCountFailures() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count failures resulting from Call's Caller invocation.
- getCountRequests() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count incoming request elements processed by Call's DoFn.
- getCountResponses() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count outgoing responses resulting from Call's successful Caller invocation.
- getCountryOfResidence() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
- getCountSetup() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count invocations of SetupTeardown.setup().
- getCountShouldBackoff() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count when CallShouldBackoff.isTrue() is found true.
- getCountSleeps() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count invocations of Sleeper.sleep(long).
- getCountTeardown() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Count invocations of SetupTeardown.teardown().
- getCpu() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- getCpuRate() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- getCreatedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which this partition was first detected and created in the metadata table.
- getCreatedAtIndexName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
- getCreateDisposition() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getCreateDisposition() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
- getCreateDisposition() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
- getCreateFromSnapshot() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
If set, the snapshot from which the job should be created.
- getCreateTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets the create time.
- getCreator(TypeDescriptor<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Get an object creator for an AVRO-generated SpecificRecord.
- getCredential() - Method in interface org.apache.beam.sdk.extensions.gcp.auth.CredentialFactory
- getCredential() - Method in class org.apache.beam.sdk.extensions.gcp.auth.GcpCredentialFactory
-
Returns a default GCP Credentials or null when it fails.
- getCredential() - Method in class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
- getCredential() - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsUserCredentialFactory
-
Returns Credentials as configured by GoogleAdsOptions.
- getCredentialFactoryClass() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
The class of the credential factory that should be created and used to create credentials.
- getCredentials() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getCrossProduct() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration
- getCsvConfiguration() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- getCsvFormat() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.CsvToRow
- getCsvRecord() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
-
The CSV record associated with the caught Exception.
- getCurrent() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
- getCurrent() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
- getCurrent() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
Gets the current record from the delegate reader.
- getCurrent() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getCurrent() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Returns the value of the data item that was read by the last Source.Reader.start() or Source.Reader.advance() call.
- getCurrentBlock() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
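The Source.Reader accessors above are only defined after a successful start() or advance(); the canonical read loop looks like this hedged sketch ("source" and "options" assumed):

    try (BoundedSource.BoundedReader<String> reader = source.createReader(options)) {
      for (boolean more = reader.start(); more; more = reader.advance()) {
        String element = reader.getCurrent();             // valid only while more == true
        Instant timestamp = reader.getCurrentTimestamp(); // timestamp of the current item
        // ... process element ...
      }
    }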
- getCurrentBlock() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
Returns the current block (the block that was read by the last successful call to BlockBasedSource.BlockBasedReader.readNextBlock()).
- getCurrentBlockOffset() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
- getCurrentBlockOffset() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
Returns the largest offset such that starting to read from that offset includes the current block.
- getCurrentBlockSize() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
- getCurrentBlockSize() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
Returns the size of the current block in bytes as it is represented in the underlying file, if possible.
- getCurrentBundle() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getCurrentBundleTimestamp() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getCurrentContainer() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
-
Return the MetricsContainer for the current thread.
- getCurrentDirectory() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
- getCurrentDirectory() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
- getCurrentDirectory() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
-
Returns the ResourceId that represents the current directory of this ResourceId.
- getCurrentKey() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- getCurrentOffset() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
- getCurrentOffset() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
- getCurrentOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
Returns the starting offset of the current record, which has been read by the last successful Source.Reader.start() or Source.Reader.advance() call.
- getCurrentOutputWatermark() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- getCurrentParent() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Gets the parent composite transform to the current transform, if one exists.
- getCurrentRateLimit() - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
- getCurrentRecord() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
-
Returns the current record.
- getCurrentRecordId() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns a unique identifier for the current record.
- getCurrentRecordOffset() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
- getCurrentRelativeTime() - Method in interface org.apache.beam.sdk.state.Timer
-
Returns the current relative time used by Timer.setRelative() and Timer.offset(org.joda.time.Duration).
- getCurrentRowAsStruct() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Returns the record at the current pointer as a Struct.
- getCurrentSchemaPlus() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcConnection
-
Calcite-created SchemaPlus wrapper for the current schema.
- getCurrentSource() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
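Timer.getCurrentRelativeTime() (above) reads the base instant that setRelative() and offset() measure from. A hedged sketch of a processing-time timer in a stateful DoFn:

    static class FlushFn extends DoFn<KV<String, String>, String> {
      @TimerId("flush")
      private final TimerSpec flushSpec = TimerSpecs.timer(TimeDomain.PROCESSING_TIME);

      @ProcessElement
      public void process(@TimerId("flush") Timer flush) {
        // Fire one minute after the current relative (processing) time.
        flush.offset(Duration.standardMinutes(1)).setRelative();
      }

      @OnTimer("flush")
      public void onFlush(OnTimerContext context) {
        // ... emit buffered output ...
      }
    }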
- getCurrentSource() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
- getCurrentSource() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
Returns a Source describing the same input that this Reader currently reads (including items already read).
- getCurrentSource() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
- getCurrentSource() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getCurrentSource() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
- getCurrentSource() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Returns a Source describing the same input that this Reader currently reads (including items already read).
- getCurrentSource() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns the UnboundedSource that created this reader.
- getCurrentTimestamp() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
By default, returns the minimum possible timestamp.
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Returns the timestamp associated with the current data item.
- getCurrentToken() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- getCurrentTransform() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
- getCurrentTransform() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getCurrentTransform() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getCurrentTransform() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getCurrentVersion() - Method in class org.apache.beam.runners.flink.translation.types.UnversionedTypeSerializerSnapshot
- getCurrentVersion() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer.FlinkStateNameSpaceSerializerSnapshot
- getCursor() - Method in class org.apache.beam.sdk.io.redis.RedisCursor
- getCustomBeamRequirement() - Method in interface org.apache.beam.sdk.extensions.python.PythonExternalTransformOptions
- getCustomerId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
- getCustomerProvidedKey() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getCustomError() - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.MatcherAndError
- getCustomError(HttpRequestWrapper, HttpResponseWrapper) - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors
- getDanglingDataSets() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
- getData() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getData() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets the data.
- getData(Row) - Method in interface org.apache.beam.sdk.io.iceberg.DynamicDestinations
- getDataAsBytes() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getDatabase() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- getDatabase() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getDatabase() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Returns the Snowflake database.
- getDatabase() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getDatabase() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getDatabaseAdminClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- getDatabaseClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- getDatabaseRole() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getDataBoostEnabled() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getDataCatalogEndpoint() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptions
-
DataCatalog endpoint.
- getDataCatalogSegments() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- getDataCatalogSegments() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
-
Returns the data catalog segments.
- getDataClient() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
- getDataCoder() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
- getDataflowClient() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
An instance of the Dataflow client.
- getDataflowEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Dataflow endpoint to use.
- getDataflowEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Dataflow endpoint to use.
- getDataflowJobFile() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The path to write the translated Dataflow job specification out to at job submission time.
- getDataflowKmsKey() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
GCP Cloud KMS key for Dataflow pipelines and buckets created by GcpTempLocationFactory.
- getDataflowOptions() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
- getDataflowRunnerInfo() - Static method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Returns an instance of DataflowRunnerInfo.
- getDataflowServiceOptions() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Service options are set by the user and configure the service.
- getDataflowWorkerJar() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- getDataResource() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getDataSchema() - Method in interface org.apache.beam.sdk.io.iceberg.DynamicDestinations
- getDataset(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getDataset(String, String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getDataset(String, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Gets the specified Dataset resource by dataset ID.
- getDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
-
Gets the specified Dataset resource by dataset ID.
- getDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getDataset(String, String, Map<String, String>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getDataset(String, Map<String, String>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getDataset(PCollection<T>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
- getDataset(PCollection<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getDataSetOrThrow(String) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
- getDatasetService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
-
Returns a real, mock, or fake BigQueryServices.DatasetService.
- getDatasetService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
- getDatasetService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- getDataSource() - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
- getDataSource() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getDataSourceConfiguration() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
- getDataSourceConfiguration() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
- getDataSourceConfiguration() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getDataSourceProviderFn() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Returns a DataSource provider function for connection credentials.
- getDataStreamOrThrow(String) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
- getDataType() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
- getDateTime() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getDateTime(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.DATETIME value by field index; an IllegalStateException is thrown if the schema doesn't match.
- getDateTime(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.DATETIME value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getDatumFactory() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns the datum factory used for encoding/decoding.
- getDatumReader() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns the DatumReader used for decoding.
- getDatumWriter() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns the DatumWriter used for encoding.
- getDatumWriterFactory(DestinationT) - Method in class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
-
Return an AvroSink.DatumWriterFactory for a given destination.
- getDayOfMonth() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- getDayOfMonth() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- getDbSize() - Method in class org.apache.beam.sdk.io.redis.RedisCursor
- getDebeziumConnectionProperties() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- getDecimal() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getDecimal(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a BigDecimal value by field index; a ClassCastException is thrown if the schema doesn't match.
- getDecimal(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.DECIMAL value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getDef() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl
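Row's typed getters, such as getDateTime and getDecimal above, validate the requested field against the Row's Schema. A small sketch assuming the usual Beam and Joda-Time imports (field names are illustrative):

    Schema schema =
        Schema.builder().addDateTimeField("ts").addDecimalField("amount").build();
    Row row =
        Row.withSchema(schema)
            .addValues(new DateTime(2024, 1, 1, 0, 0), new BigDecimal("9.99"))
            .build();
    ReadableDateTime ts = row.getDateTime("ts");   // DATETIME field
    BigDecimal amount = row.getDecimal("amount");  // DECIMAL field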
- getDef() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats
- getDefault() - Static method in class org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter
- getDefaultCoder(TypeDescriptor<?>, CoderRegistry) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
Returns the default coder for a given type descriptor.
- getDefaultDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Returns the default destination.
- getDefaultEnvironmentConfig() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getDefaultEnvironmentType() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getDefaultHeaders() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getDefaultJobName() - Static method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
- getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.Source
-
Deprecated. Override Source.getOutputCoder() instead.
- getDefaultOutputCoder() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Deprecated. Instead, the PTransform should explicitly call PCollection.setCoder(org.apache.beam.sdk.coders.Coder<T>) on the returned PCollection.
- getDefaultOutputCoder(InputT) - Method in class org.apache.beam.sdk.transforms.PTransform
-
Deprecated. Instead, the PTransform should explicitly call PCollection.setCoder(org.apache.beam.sdk.coders.Coder<T>) on the returned PCollection.
- getDefaultOutputCoder(InputT, PCollection<T>) - Method in class org.apache.beam.sdk.transforms.PTransform
-
Deprecated. Instead, the PTransform should explicitly call PCollection.setCoder(org.apache.beam.sdk.coders.Coder<T>) on the returned PCollection.
- getDefaultOutputCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
- getDefaultOutputCoder(CoderRegistry, Coder<Boolean>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- getDefaultOutputCoder(CoderRegistry, Coder<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- getDefaultOutputCoder(CoderRegistry, Coder<TimestampedValue<KV<EventKeyT, KV<Long, EventT>>>>) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
- getDefaultOutputCoder(CoderRegistry, Coder<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- getDefaultOutputCoder(CoderRegistry, Coder<byte[]>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
- getDefaultOutputCoder(CoderRegistry, Coder<String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
-
Returns the Coder to use by default for output OutputT values, or null if it is not able to be inferred.
- getDefaultOutputCoder(CoderRegistry, Coder<Double>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- getDefaultOutputCoder(CoderRegistry, Coder<Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- getDefaultOutputCoder(CoderRegistry, Coder<Long>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
- getDefaultOutputCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
- getDefaultOverrides(boolean) - Static method in class org.apache.beam.runners.spark.SparkTransformOverrides
- getDefaultPrecision(SqlTypeName) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
- getDefaultSdkHarnessLogLevel() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
This option controls the default log level of all loggers without a log level override.
- getDefaultTimezone() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
- getDefaultValue() - Method in interface org.apache.beam.sdk.values.PCollectionViews.HasDefaultValue
- getDefaultValue() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated. Returns the default value that was specified.
- getDefaultValue() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
-
Returns the default value that was specified.
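The default value returned by SingletonViewFn originates from View.asSingleton().withDefaultValue(...); when the viewed PCollection is empty, the side input yields that default. A hedged sketch over an assumed PCollection<Integer> maybeEmpty:

    PCollectionView<Integer> threshold =
        maybeEmpty.apply(View.<Integer>asSingleton().withDefaultValue(0));
    // A ParDo configured .withSideInputs(threshold) then observes 0 if maybeEmpty is empty.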
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
Return a WindowMappingFn that returns the earliest window that contains the end of the main-input window.
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Returns the default WindowMappingFn to use to map main input windows to side input windows.
- getDefaultWorkerLogLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated. This option controls the default log level of all loggers without a log level override.
- getDeidentifyConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getDeidentifyTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getDelay() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.Delay
- getDelimiter() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration
- getDelimiters() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
- getDeliveryMode() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getDeliveryMode() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getDependencies() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
- getDependencies(ConfigT, PipelineOptions) - Method in interface org.apache.beam.sdk.transforms.ExternalTransformBuilder
-
List the dependencies needed for this transform.
- getDependencies(RunnerApi.FunctionSpec, PipelineOptions) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
- getDescription() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
If the field has a description, returns the description for the field.
- getDescription() - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns the field's description.
- getDescription() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
-
The description of what was being attempted when the failure occurred.
- getDescriptor(String) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- getDescriptorFromTableSchema(TableSchema, boolean, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
-
Given a BigQuery TableSchema, returns a protocol-buffer Descriptor that can be used to write data using the BigQuery Storage API.
- getDeserializer(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- getDeserializer(Map<String, ?>, boolean) - Method in interface org.apache.beam.sdk.io.kafka.DeserializerProvider
- getDesiredNumUnboundedSourceSplits() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The desired number of initial splits for UnboundedSources.
- getDestination() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
-
Staged target for this file.
- getDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
- getDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Return the user destination object for this writer.
- getDestination() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the destination (topic or queue) to which the message was sent.
- getDestination(String, String) - Method in interface org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestinationProvider
- getDestination(ValueInSingleWindow<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- getDestination(ValueInSingleWindow<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns an object that represents at a high level which table is being written to.
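A minimal sketch of a custom BigQuery DynamicDestinations; the per-country routing scheme, project, dataset, and "country" field are invented for illustration:

    import com.google.api.services.bigquery.model.TableFieldSchema;
    import com.google.api.services.bigquery.model.TableRow;
    import com.google.api.services.bigquery.model.TableSchema;
    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations;
    import org.apache.beam.sdk.io.gcp.bigquery.TableDestination;
    import org.apache.beam.sdk.values.ValueInSingleWindow;

    // Routes each element to a per-country table.
    class PerCountryDestinations extends DynamicDestinations<TableRow, String> {
      @Override
      public String getDestination(ValueInSingleWindow<TableRow> element) {
        return (String) element.getValue().get("country"); // high-level routing key
      }

      @Override
      public TableDestination getTable(String country) {
        return new TableDestination("my-project:events." + country, "Events for " + country);
      }

      @Override
      public TableSchema getSchema(String country) {
        return new TableSchema().setFields(Arrays.asList(
            new TableFieldSchema().setName("country").setType("STRING")));
      }
    }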
- getDestination(ValueInSingleWindow<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- getDestination(UserT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Returns an object that represents at a high level the destination being written to.
- getDestinationCoder() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Returns the coder for FileBasedSink.DynamicDestinations.
- getDestinationCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns the coder for DynamicDestinations.
- getDestinationCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- getDestinationFile(boolean, FileBasedSink.DynamicDestinations<?, DestinationT, ?>, int, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
- getDestinationFn() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getDiagnostics() - Method in exception class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler.CompileException
- getDictionary(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getDictionary(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getDir() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
- getDirectoryTreatment() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
- getDisableAutoCommit() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getDisableMetrics() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getDiskSizeGb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Remote worker disk size, in gigabytes, or 0 to use the default size.
- getDistribution() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- getDistribution(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- getDistribution(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the Distribution that should be used for implementing the given metricName in this container.
- getDistributions() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the distributions that matched the filter.
- getDlqTransform(String) - Static method in class org.apache.beam.sdk.schemas.io.GenericDlq
- getDocToBulk() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- getDocumentCount() - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
- getDoFn() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- getDoFn() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.PartialReduceBundleOperator
- getDoFn() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.WindowDoFnOperator
- getDoFn() - Method in class org.apache.beam.runners.spark.translation.SparkProcessContext
- getDoFnRunner() - Method in class org.apache.beam.runners.spark.translation.SparkProcessContext
- getDoFnRunner(PipelineOptions, DoFn<InputT, OutputT>, SideInputReader, AbstractParDoP.JetOutputManager, TupleTag<OutputT>, List<TupleTag<?>>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, WindowingStrategy<?, ?>, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Method in class org.apache.beam.runners.jet.processors.ParDoP
- getDoFnRunner(PipelineOptions, DoFn<KV<?, ?>, OutputT>, SideInputReader, AbstractParDoP.JetOutputManager, TupleTag<OutputT>, List<TupleTag<?>>, Coder<KV<?, ?>>, Map<TupleTag<?>, Coder<?>>, WindowingStrategy<?, ?>, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- getDoFnSchemaInformation(DoFn<?, ?>, PCollection<?>) - Static method in class org.apache.beam.sdk.transforms.ParDo
-
Extract information on how the DoFn uses schemas.
- getDouble() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getDouble(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.DOUBLE value by field index; a ClassCastException is thrown if the schema doesn't match.
- getDouble(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.DOUBLE value by field name; an IllegalStateException is thrown if the schema doesn't match (see the sketch below).
- getDriverClassName() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
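A minimal sketch of the Row typed getters indexed above; the schema and values are invented for illustration:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    Schema schema = Schema.builder()
        .addStringField("name")
        .addDoubleField("price")
        .build();
    Row row = Row.withSchema(schema).addValues("widget", 9.99).build();
    double byIndex = row.getDouble(1);       // access by field index
    double byName = row.getDouble("price");  // access by field name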
- getDriverClassName() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getDriverJars() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getDriverJars() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getDrop() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getDrop() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getDrop() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
- getDropFields() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getDStream() - Method in class org.apache.beam.runners.spark.translation.streaming.UnboundedDataset
- getDumpHeapOnOOM() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
If true, save a heap dump before killing a thread or process which is GC thrashing or out of memory.
- getDuplicateCount() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getDynamicDestinations() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSink
- getDynamicDestinations() - Method in class org.apache.beam.sdk.io.FileBasedSink
-
Return the FileBasedSink.DynamicDestinations used.
- getEarliestBufferedSequence() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getEarliestHL7v2SendTime(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Gets the earliest HL7v2 send time.
- getEarliestHL7v2SendTime(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- getEarlyTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
- getEffectiveInputWatermark() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- getElasticsearchHttpPort() - Method in interface org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon.ElasticsearchPipelineOptions
- getElasticsearchServer() - Method in interface org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon.ElasticsearchPipelineOptions
- getElemCoder() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
- getElement() - Method in class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
- getElementByteSize() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- getElementCoder() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- getElementCoders() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- getElementConverters() - Method in class org.apache.beam.sdk.transforms.DoFnSchemaInformation
-
The schema of the @Element parameter.
- getElementCount() - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
-
The number of elements after which this trigger may fire.
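A short sketch for the AfterPane entry above; the count is whatever the factory method was given:

    import org.apache.beam.sdk.transforms.windowing.AfterPane;

    AfterPane trigger = AfterPane.elementCountAtLeast(100);
    // trigger.getElementCount() reports the configured threshold: 100.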
- getElements() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.KeyedBufferingElementsHandler
- getElements() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.NonKeyedBufferingElementsHandler
- getElements() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate
- getElements() - Method in class org.apache.beam.sdk.testing.TestStream.ElementEvent
- getElements() - Method in class org.apache.beam.sdk.transforms.Create.Values
- getElementType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
If the field is a container type, returns the element type.
- getEmptyMatchTreatment() - Method in class org.apache.beam.sdk.io.FileBasedSource
- getEmptyMatchTreatment() - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
- getEmptyMatchTreatment() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
- getEmulatorHost() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
A host port pair to allow connecting to a Cloud Firestore emulator instead of the live service.
- getEmulatorHost() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getEnableBucketReadMetricCounter() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
If true, reports number of bytes read from each gcs bucket.
- getEnableBucketWriteMetricCounter() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
If true, reports number of bytes written to each gcs bucket.
- getEnableHeapDumps() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
If true and PipelineOption tempLocation is set, save a heap dump before shutting down the JVM due to GC thrashing or out of memory.
- getEnableLogViaFnApi() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
This option controls whether logging will be redirected through the FnApi.
- getEnableSparkMetricSinks() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- getEnableStableInputDrain() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getEnableStorageReadApiV2() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getEnableWebUI() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- getEncodedElementByteSize(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
- getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
-
Overridden to short-circuit the default StructuredCoder behavior of encoding and counting the bytes.
- getEncodedElementByteSize(HyperLogLogPlus) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
- getEncodedElementByteSize(TableRow) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- getEncodedElementByteSize(ByteString) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- getEncodedElementByteSize(Boolean) - Method in class org.apache.beam.sdk.coders.BooleanCoder
- getEncodedElementByteSize(Byte) - Method in class org.apache.beam.sdk.coders.ByteCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Double) - Method in class org.apache.beam.sdk.coders.DoubleCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Float) - Method in class org.apache.beam.sdk.coders.FloatCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
- getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.VarIntCoder
- getEncodedElementByteSize(Long) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Long) - Method in class org.apache.beam.sdk.coders.VarLongCoder
- getEncodedElementByteSize(Short) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(String) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Void) - Method in class org.apache.beam.sdk.coders.VoidCoder
- getEncodedElementByteSize(BigDecimal) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(BigInteger) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Optional<T>) - Method in class org.apache.beam.sdk.coders.OptionalCoder
-
Overridden to short-circuit the default StructuredCoder behavior of encoding and counting the bytes.
- getEncodedElementByteSize(IsmFormat.Footer) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.FooterCoder
- getEncodedElementByteSize(IsmFormat.KeyPrefix) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefixCoder
- getEncodedElementByteSize(RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- getEncodedElementByteSize(EncodedBoundedWindow) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
- getEncodedElementByteSize(BigQueryInsertError) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- getEncodedElementByteSize(OffsetRange) - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
- getEncodedElementByteSize(ByteString) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- getEncodedElementByteSize(Instant) - Method in class org.apache.beam.sdk.coders.InstantCoder
- getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.Coder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
Overridden to short-circuit the default StructuredCoder behavior of encoding and counting the bytes.
- getEncodedElementByteSizeUsingCoder(Coder<T>, T) - Static method in class org.apache.beam.sdk.coders.Coder
- getEncodedRecord() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record
-
Nullable to account for failing to encode, or if there is no coder for the record at the time of failure.
- getEncodedTypeDescriptor() - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ByteCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.Coder
-
Returns the TypeDescriptor for the type encoded (see the sketch after this list).
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.CollectionCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DelegateCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DequeCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DoubleCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DurationCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.FloatCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.InstantCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.IterableCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.KvCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ListCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.MapCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.NullableCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.OptionalCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SerializableCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SetCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VarIntCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VarLongCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VoidCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
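A small illustration of the overrides listed above, using two built-in coders:

    import org.apache.beam.sdk.coders.KvCoder;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.coders.VarLongCoder;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.TypeDescriptor;

    TypeDescriptor<String> stringType = StringUtf8Coder.of().getEncodedTypeDescriptor();
    TypeDescriptor<KV<String, Long>> kvType =
        KvCoder.of(StringUtf8Coder.of(), VarLongCoder.of()).getEncodedTypeDescriptor();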
- getEncodedWindow() - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
- getEncodingPositions() - Method in class org.apache.beam.sdk.schemas.Schema
-
Gets the encoding positions for this schema.
- getEnd() - Method in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
- getEnd() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
- getEnd() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
- getEnd() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
- getEndAtTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- getEndKey() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns the ByteKey representing the upper bound of this ByteKeyRange.
- getEndOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns the specified ending offset of the source.
- getEndOffset() - Method in interface org.apache.beam.sdk.io.sparkreceiver.HasOffset
- getEndpoint() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
Endpoint used to configure AWS service clients.
- getEndTime() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- getEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEndRecord
-
The end timestamp at which the change stream partition is terminated.
- getEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The end time for querying this given partition.
- getEnumeratorCheckpointSerializer() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- getEnvironment() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
- getEnvironment() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
-
Return the environment that the remote handles.
- getEnvironment() - Method in class org.apache.beam.runners.fnexecution.environment.RemoteEnvironment.SimpleRemoteEnvironment
- getEnvironment() - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironment
- getEnvironment() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getEnvironmentCacheMillis() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getEnvironmentExpirationMillis() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getEnvironmentId() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getEnvironmentOption(PortablePipelineOptions, String) - Static method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
Return the value for the specified environment option or empty string if not present.
- getEnvironmentOptions() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getEquivalentFieldType(TableSchema.ColumnType) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema
-
Returns Beam equivalent of ClickHouse column type.
- getEquivalentSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema
-
Returns Beam equivalent of ClickHouse schema.
- getError() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- getError() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
-
The error details if the message could not be published.
- getError() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Returns the parse error, if the file was parsed unsuccessfully.
- getError() - Method in class org.apache.beam.sdk.schemas.io.Failure
-
Information about the cause of the failure.
- getErrorAsString() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Same as ParseResult.getError(), but returns the complete stack trace of the error as a String.
- getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
- getErrorHandling() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
- getErrorInfo(IOException) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
- getErrorMessage() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
- getErrorMessage() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getErrorRowSchema(Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- getErrors() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
-
The CsvIOParseError PCollection as a result of errors associated with parsing CSV records.
- getEstimatedLowWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.impulse.BeamImpulseSource
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.BoundedCosmosBDSource
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
-
An estimate of the total size (in bytes) of the data that would be read from this source.
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
- getEvaluator() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getEvent() - Method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
- getEventCoder(Pipeline, Coder<KV<KeyT, KV<Long, EventT>>>) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Provide the event coder.
- getEventExaminer() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
- getEvents() - Method in class org.apache.beam.sdk.testing.TestStream
-
Returns the sequence of Events in this TestStream.
- getEx() - Method in class org.apache.beam.runners.jet.processors.ParDoP.Supplier
- getEx() - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP.Supplier
- getEx() - Method in class org.apache.beam.runners.jet.processors.FlattenP.Supplier
- getException() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
-
The exception itself, e.g.
- getExceptionStacktrace() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
-
The full stacktrace.
- getExecutables() - Method in interface org.apache.beam.runners.direct.ExecutableGraph
- getExecutableStageIntermediateId(PipelineNode.PTransformNode) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
- getExecuteStreamingSqlRetrySettings() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getExecutionEnvironment() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
- getExecutionEnvironment() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
- getExecutionModeForBatch() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getExecutionRetryDelay() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getExecutorService() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
Deprecated. Use ExecutorOptions.getScheduledExecutorService() instead.
- getExpansionPort() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- getExpansionServiceConfig() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- getExpansionServiceConfigFile() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- getExpectedAssertions() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
- getExpectFileToNotExist() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
-
If true, the created file is expected to not exist.
- getExperiments() - Method in interface org.apache.beam.sdk.options.ExperimentalOptions
- getExperimentValue(PipelineOptions, String) - Static method in interface org.apache.beam.sdk.options.ExperimentalOptions
-
Return the value for the specified experiment or null if not present.
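A minimal sketch, assuming an experiment supplied as a name=value pair (the experiment name here is invented):

    import org.apache.beam.sdk.options.ExperimentalOptions;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    PipelineOptions options =
        PipelineOptionsFactory.fromArgs("--experiments=my_experiment=64").create();
    // Returns "64"; returns null when the experiment is not present.
    String value = ExperimentalOptions.getExperimentValue(options, "my_experiment");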
- getExpiration() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getExpiration() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the message expiration time in milliseconds since the Unix epoch.
- getExplanation() - Method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
- getExplicitHashKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner.ExplicitPartitioner
-
Required hash value (128-bit integer) to determine explicitly the shard a record is assigned to based on the hash key range of each shard.
- getExplicitHashKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
-
Optional hash value (128-bit integer) to determine explicitly the shard a record is assigned to based on the hash key range of each shard.
- getExpression() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
- getExpression(SchemaPlus, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getExpressionConverter() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter
- getExtendedSequenceNumber() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getExtensionHosts() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- getExtensionRegistry() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
Returns the ExtensionRegistry listing all known Protocol Buffers extension messages to T registered with this ProtoCoder.
- getExternalSorterType() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
-
Returns the external sorter type.
- getExtraInteger() - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config
- getExtraString() - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config
- getFactory() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertValueForGetter
- getFactory() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertValueForSetter
- getFactory() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- getFactory() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- getFactory(AwsOptions) - Static method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
Get a ClientBuilderFactory instance according to AwsOptions.getClientBuilderFactory().
- getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
-
Gets failed bodies with errors.
- getFailedBundles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
Gets failed FhirBundleResponse wrapped inside HealthcareIOError.
- getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
-
Gets failed file imports with errors.
- getFailedInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Returns a PCollection containing the TableRows that didn't make it to BQ.
- getFailedInsertsWithErr() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Returns a PCollection containing the BigQueryInsertErrors with detailed error information (see the sketch below).
- getFailedInsertsWithErr() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
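A hedged sketch of retrieving the detailed failed inserts from a BigQuery WriteResult; the input PCollection<TableRow> (tableRows) and the table name are assumed:

    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError;
    import org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy;
    import org.apache.beam.sdk.io.gcp.bigquery.WriteResult;
    import org.apache.beam.sdk.values.PCollection;

    WriteResult result = tableRows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:dataset.table")
            .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
            .withExtendedErrorInfo() // required for getFailedInsertsWithErr()
            .withFailedInsertRetryPolicy(InsertRetryPolicy.retryTransientErrors()));
    PCollection<BigQueryInsertError> failed = result.getFailedInsertsWithErr();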
- getFailedLatencyMetric() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getFailedMessages() - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
- getFailedMutations() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
-
Gets failed reads.
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
-
Gets failed reads.
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
-
Gets failed reads.
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.Result
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- getFailedRowsTag() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- getFailedRowsTupleTag() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
- getFailedSearches() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
-
Gets failed searches.
- getFailedStorageApiInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Return any rows that persistently fail to insert when using a storage-api method.
- getFailedToParseLines() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
-
Returns a PCollection containing the Rows that didn't parse (see the sketch below).
- getFailOnCheckpointingErrors() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
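A minimal sketch for the JsonToRow.ParseResult entry above, assuming an input PCollection<String> of JSON lines named jsonLines:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.JsonToRow;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    Schema schema = Schema.builder().addStringField("name").addInt64Field("count").build();
    JsonToRow.ParseResult parsed = jsonLines.apply(JsonToRow.withExceptionReporting(schema));
    PCollection<Row> rows = parsed.getResults();
    PCollection<Row> failed = parsed.getFailedToParseLines(); // malformed lines land here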
- getFailsafeTableRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- getFailsafeTableRowPayload() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- getFailsafeValue() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the failsafe value of this FailsafeValueInSingleWindow.
- getFailure() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
-
Information about why the record failed.
- getFailureCollector() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getFailures() - Method in class org.apache.beam.io.requestresponse.Result
- getFanout() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
- getFasterCopy() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getFetchSize() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getFhirBundleParameter() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
-
FhirBundleParameter represents a FHIR bundle in JSON format to be executed on a FHIR store.
- getFhirStore() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
- getField() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
- getField() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.KeyPart
- getField() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- getField(int) - Method in class org.apache.beam.sdk.schemas.Schema
-
Return a field by index.
- getField(String) - Method in class org.apache.beam.sdk.schemas.Schema
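A small sketch of the Schema field lookups indexed here; the schema is invented for illustration:

    import org.apache.beam.sdk.schemas.Schema;

    Schema schema = Schema.builder()
        .addStringField("id")
        .addDoubleField("score")
        .build();
    Schema.Field byIndex = schema.getField(0);      // field "id", by index
    Schema.Field byName = schema.getField("score"); // same lookup, by name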
- getFieldAccessDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFnSchemaInformation
-
Effective FieldAccessDescriptor applied by DoFn.
- getFieldCount() - Method in class org.apache.beam.sdk.schemas.Schema
-
Return the count of fields.
- getFieldCount() - Method in class org.apache.beam.sdk.values.Row
-
Return the size of data fields.
- getFieldCount() - Method in class org.apache.beam.sdk.values.RowWithGetters
- getFieldCount() - Method in class org.apache.beam.sdk.values.RowWithStorage
- getFieldDescription(T) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- getFieldId() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
- getFieldName() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
- getFieldNames() - Method in class org.apache.beam.sdk.schemas.Schema
-
Return the list of all field names.
- getFieldOptionById(int) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- getFieldRef(CEPOperation) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
A function that finds a pattern reference recursively.
- getFieldRename() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
- getFields() - Method in class org.apache.beam.sdk.schemas.Schema
- getFields() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration
- getFields() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
- getFields(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- getFieldsAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
- getFieldType(OneOfType.Value) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
- getFieldType(Schema, CEPOperation) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
- getFieldTypes(TypeDescriptor<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- getFieldTypes(TypeDescriptor<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- getFieldTypes(TypeDescriptor<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Get field types for an AVRO-generated SpecificRecord or a POJO.
- getFileDescriptor(String) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- getFileDescriptorPath() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getFileDescriptorPath() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getFileDescriptorPath() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getFileDescriptorPath() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getFileFormat() - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination
-
File format for created files.
- getFileInputSplitMaxSizeMB() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getFileLocation() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Returns the absolute path to the input file.
- getFilename() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
- getFilename() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
- getFilename() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
-
The filename associated with the caught Exception.
- getFilename() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
-
Returns the name of the file or directory denoted by this ResourceId.
- getFilename(BoundedWindow, PaneInfo, int, int, Compression) - Method in interface org.apache.beam.sdk.io.FileIO.Write.FileNaming
-
Generates the filename.
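FileIO.Write.FileNaming is a single-method interface, so a lambda works; a sketch with an invented shard-name pattern:

    import org.apache.beam.sdk.io.FileIO;

    FileIO.Write.FileNaming naming =
        (window, pane, numShards, shardIndex, compression) ->
            String.format("out-%05d-of-%05d%s",
                shardIndex, numShards, compression.getSuggestedSuffix());
    // Typically passed to FileIO.write().withNaming(naming).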
- getFileName() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- getFilenamePolicy(DestinationT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Converts a destination into a FileBasedSink.FilenamePolicy.
- getFilenamePrefix() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- getFilenameSuffix() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- getFilenameSuffix() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getFileOrPatternSpec() - Method in class org.apache.beam.sdk.io.FileBasedSource
- getFileOrPatternSpecProvider() - Method in class org.apache.beam.sdk.io.FileBasedSource
- getFilepattern() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
-
The filepattern used to match and read files.
- getFilePattern() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
- getFilePattern() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
- getFilePattern() - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration
- getFilesList() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the list of names of staged files.
- getFilesList() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceConfig
-
Getter for the list of staged files which will be loaded to Snowflake.
- getFilesToStage() - Method in interface org.apache.beam.sdk.options.FileStagingOptions
-
List of local files to make available to workers.
- getFileSystem() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- getFilter() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getFilterFormatFunction(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- getFilterString() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getFinishBundleBeforeCheckpointing() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getFinishedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which the connector finished processing this partition.
- getFirestoreDb() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
The Firestore database ID to connect to.
- getFirestoreHost() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
A host port pair to allow connecting to a Cloud Firestore instead of the default live service.
- getFirestoreProject() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
The Firestore project ID to connect to.
- getFirstTimestamp() - Method in class org.apache.beam.runners.spark.translation.SparkStreamingTranslationContext
- getFlatComparators() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- getFlatJsonRows(Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
- getFlexRSGoal() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
This option controls Flexible Resource Scheduling mode.
- getFlinkConfDir() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getFlinkMaster() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
The url of the Flink JobManager on which to execute pipelines.
- getFloat() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getFloat(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.FLOAT value by field index; a ClassCastException is thrown if the schema doesn't match.
- getFloat(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.FLOAT value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getFn() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
- getFn() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
- getFn() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
- getFn() - Method in class org.apache.beam.runners.spark.translation.DoFnRunnerWithMetrics
- getFn() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns the CombineFnBase.GlobalCombineFn used by this Combine operation.
- getFn() - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
-
Returns the CombineFnBase.GlobalCombineFn used by this Combine operation.
- getFn() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
Returns the CombineFnBase.GlobalCombineFn used by this Combine operation.
- getFn() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
- getFn() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
- getFnApiDevContainerVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the version/tag for dev SDK FnAPI container image.
- getFnApiEnvironmentMajorVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the FnAPI environment's major version number.
- getForceSlotSharingGroup() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getForceUnalignedCheckpointEnabled() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getFormat() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
-
The format of the file(s) to read.
- getFormat() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getFormat() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getFormat() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getFormatClass() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets InputFormat or OutputFormat class for a plugin.
- getFormatClass() - Method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Format
- getFormatClass() - Method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
- getFormatName() - Method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Format
- getFormatProviderClass() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets InputFormatProvider or OutputFormatProvider class for a plugin.
- getFormatProviderClass() - Method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.FormatProvider
- getFormatProviderName() - Method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.FormatProvider
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
Returns a value in [0, 1] representing approximately what fraction of the current source this reader has read so far, or null if such an estimate is not available.
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
- getFractionConsumed() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
-
Returns the approximate fraction of positions in the source that have been consumed by successful RangeTracker.tryReturnRecordAt(boolean, PositionT) calls, or 0.0 if no such calls have happened.
- getFractionOfBlockConsumed() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
-
Returns the fraction of the block already consumed, if possible, as a value in [0, 1].
- getFrom() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
-
Returns the range start timestamp (inclusive).
- getFrom() - Method in class org.apache.beam.sdk.io.range.OffsetRange
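For the OffsetRange entry, a one-line sketch:

    import org.apache.beam.sdk.io.range.OffsetRange;

    OffsetRange range = new OffsetRange(0, 100);
    long start = range.getFrom(); // 0, the inclusive start
    long end = range.getTo();     // 100, the exclusive end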
- getFrom(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
-
Returns the estimated throughput bytes for a specified time.
- getFrom(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.NullThroughputEstimator
-
Always returns 0.
- getFrom(Timestamp) - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.ThroughputEstimator
-
Returns the estimated throughput for a specified time.
- getFromRowFunction() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
- getFromRowFunction() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
-
Returns the fromRow conversion function.
- getFromRowFunction() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the attached schema's fromRowFunction.
- getFromRowFunction(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
- getFromRowFunction(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve the function that converts a Row object to the specified type.
- getFromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve the function that converts a Row object to the specified type (see the sketch below).
- getFromSnapshotExclusive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
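A hedged sketch of the SchemaRegistry lookup above; the pipeline variable and the schema-inferable POJO Purchase are assumed:

    import org.apache.beam.sdk.schemas.NoSuchSchemaException;
    import org.apache.beam.sdk.schemas.SchemaRegistry;
    import org.apache.beam.sdk.transforms.SerializableFunction;
    import org.apache.beam.sdk.values.Row;

    SchemaRegistry registry = pipeline.getSchemaRegistry();
    try {
      SerializableFunction<Row, Purchase> fromRow =
          registry.getFromRowFunction(Purchase.class);
    } catch (NoSuchSchemaException e) {
      // No schema registered or inferable for Purchase.
    }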
- getFromSnapshotInclusive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getFromSnapshotRefExclusive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getFromSnapshotRefInclusive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getFromTimestamp() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getFullCoder(Coder<T>, Coder<? extends BoundedWindow>) - Static method in class org.apache.beam.sdk.values.WindowedValues
-
Returns the Coder to use for a WindowedValue<T>, using the given valueCoder and windowCoder (see the sketch below).
- getFullName(PTransform<?, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
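A hedged sketch for the getFullCoder entry, pairing a value coder with the global window coder; the WindowedValue import location is assumed from the package shown in the entry above:

    import org.apache.beam.sdk.coders.Coder;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.transforms.windowing.GlobalWindow;
    import org.apache.beam.sdk.values.WindowedValue; // assumed location
    import org.apache.beam.sdk.values.WindowedValues;

    Coder<WindowedValue<String>> coder =
        WindowedValues.getFullCoder(StringUtf8Coder.of(), GlobalWindow.Coder.INSTANCE);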
-
Returns the full name of the transform currently being translated.
- getFullPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- getFullPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- getFunction() - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
- getFunction() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf
- getFunctionNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getFunctions(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getGapDuration() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
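For the Sessions entry, the gap duration is whatever the factory was given; a short sketch:

    import org.apache.beam.sdk.transforms.windowing.Sessions;
    import org.joda.time.Duration;

    Sessions sessions = Sessions.withGapDuration(Duration.standardMinutes(10));
    Duration gap = sessions.getGapDuration(); // ten minutes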
- getGauge(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- getGauge(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the Gauge that should be used for implementing the given metricName in this container.
- getGauges() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the gauges that matched the filter.
- getGauges(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
- getGauges(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
- getGcloudCancelCommand(DataflowPipelineOptions, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
- getGcpCredential() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
The credential instance that should be used to authenticate against GCP services.
- getGcpOauthScopes() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
Controls the OAuth scopes that will be requested when creating Credentials with the GcpCredentialFactory (which is the default CredentialFactory).
- getGcpTempLocation() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
A GCS path for storing temporary files in GCP.
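A minimal sketch, assuming the flag is passed on the command line (bucket name invented):

    import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    GcpOptions gcp = PipelineOptionsFactory
        .fromArgs("--gcpTempLocation=gs://my-bucket/tmp")
        .create()
        .as(GcpOptions.class);
    String tempLocation = gcp.getGcpTempLocation(); // "gs://my-bucket/tmp"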
- getGcsCustomAuditEntries() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- getGcsEndpoint() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
GCS endpoint to use.
- getGcsHttpRequestReadTimeout() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- getGcsHttpRequestWriteTimeout() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- getGcsPerformanceMetrics() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
If true, reports metrics of certain operations, such as batch copies.
- getGcsReadCounterPrefix() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- getGcsRewriteDataOpBatchLimit() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- getGcsUploadBufferSizeBytes() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
The buffer size (in bytes) to use when uploading files to GCS.
- getGcsUtil() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
The GcsUtil instance that should be used to communicate with Google Cloud Storage.
- getGcsWriteCounterPrefix() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- getGCThrashingPercentagePerPeriod() - Method in interface org.apache.beam.sdk.options.MemoryMonitorOptions
-
The GC thrashing threshold percentage.
- getGenericRecordToRowFunction(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Returns a function mapping AVRO GenericRecords to Beam Rows for use in PCollection.setSchema(org.apache.beam.sdk.schemas.Schema, org.apache.beam.sdk.values.TypeDescriptor<T>, org.apache.beam.sdk.transforms.SerializableFunction<T, org.apache.beam.sdk.values.Row>, org.apache.beam.sdk.transforms.SerializableFunction<org.apache.beam.sdk.values.Row, T>).
- getGetOffsetFn() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets a SerializableFunction that defines how to get the record offset for a CDAP Plugin class.
- getGetReceiverArgsFromConfigFn() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets a SerializableFunction that defines how to get constructor arguments for a Receiver using PluginConfig.
- getGetters() - Method in class org.apache.beam.sdk.values.RowWithGetters
- getGetters(TypeDescriptor<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Get generated getters for an AVRO-generated SpecificRecord or a POJO.
- getGetters(TypeDescriptor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
Return the list of FieldValueGetters for a Java Bean class.
- getGetters(TypeDescriptor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- getGetterTarget() - Method in class org.apache.beam.sdk.values.RowWithGetters
- getGlobalConfigRefreshPeriod() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getGlobalSequenceCombiner() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler.OrderedProcessingGlobalSequenceHandler
-
Provide the global sequence combiner.
- getGoogleAdsClientId() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
-
OAuth 2.0 Client ID identifying the application.
- getGoogleAdsClientSecret() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
-
OAuth 2.0 Client Secret for the specified Client ID.
- getGoogleAdsCredential() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
-
The credential instance that should be used to authenticate against the Google Ads API.
- getGoogleAdsCredentialFactoryClass() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
-
The class of the credential factory to create credentials if none have been explicitly set.
- getGoogleAdsDeveloperToken() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
-
Google Ads developer token for the user connecting to the Google Ads API.
- getGoogleAdsEndpoint() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
-
Host endpoint to use for connections to the Google Ads API.
- getGoogleAdsRefreshToken() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
-
OAuth 2.0 Refresh Token for the user connecting to the Google Ads API.
- getGoogleApiTrace() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions
-
This option enables tracing of API calls to Google services used within the Apache Beam SDK.
- getGoogleCloudStorageReadOptions() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- getGrammarFileName() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- getGrammarFileName() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- getGroupFilesFileLoad() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
Chooses whether to use a GBK when gathering the list of files in batch FILE_LOAD.
- getGroupingTableMaxSizeMb() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Size (in MB) of each grouping table used to pre-combine elements.
- getGson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.DataChangeRecordToRow
- getGzipCompressHeapDumps() - Method in interface org.apache.beam.sdk.options.MemoryMonitorOptions
-
Controls whether heap dumps that are copied to a remote destination are gzip-compressed.
- getHadoopConfiguration() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets the plugin's Hadoop configuration.
- getHasError() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
- getHashCode() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- getHdfsConfiguration() - Method in interface org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions
- getHeaderAccessor() - Static method in class org.apache.beam.sdk.fn.server.GrpcContextHeaderAccessorProvider
- getHeaderColumns() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getHeaderColumns() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- getHeaderColumns() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getHeaders() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getHeaders() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getHeartbeatMillis() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The number of milliseconds of stream idleness after which a heartbeat record will be emitted in the change stream query.
- getHighWatermark() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
- getHintMaxNumWorkers() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
A hint to the QoS system for the intended max number of workers for a pipeline.
- getHistogram(MetricName, HistogramData.BucketType) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the Histogram that should be used for implementing the given metricName in this container.
- getHistograms() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the histograms that matched the filter.
- getHistograms(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
- getHistograms(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
- getHL7v2Message() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
-
Gets hl7v2Message.
- getHL7v2Message(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Gets an HL7v2 message by its name from an HL7v2 store.
- getHL7v2Message(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
Gets HL7v2 message.
- getHl7v2MessageId() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
-
HL7v2MessageId string.
- getHL7v2Store(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Gets an HL7v2 store.
- getHL7v2Store(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
Gets HL7v2 store.
- getHoldability() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getHost() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- getHost() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- getHost() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServer
-
Get the host that this ExpansionServer is bound to.
- getHost() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getHostValue() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getHttpClient() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getHttpClientConfiguration() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
HttpClientConfiguration used to configure AWS service clients.
- getHttpPipeline() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getHTTPReadTimeout() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getHTTPWriteTimeout() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getHumanReadableJsonRecord() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record
-
The failing record, encoded as JSON.
- getIcebergCatalog() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getId() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShard
-
Return the shard id.
- getId() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
-
Get an id used to represent this bundle.
- getId() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
-
Returns an id used to represent this bundle.
- getId() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
- getId() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
- getId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
- getId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
- getId() - Method in interface org.apache.beam.sdk.fn.IdGenerator
- getId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
- getId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SchemaPath
- getId() - Method in class org.apache.beam.sdk.values.TupleTag
-
Returns the id of this TupleTag.
- getId() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
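A quick sketch of the id round-trip (the tag name is illustrative):
import org.apache.beam.sdk.values.TupleTag;

public class TupleTagIdExample {
  public static void main(String[] args) {
    // A tag constructed with an explicit id keeps that id stably;
    // the anonymous subclass preserves the type token, as the TupleTag docs recommend.
    TupleTag<String> mainOutput = new TupleTag<String>("main") {};
    System.out.println(mainOutput.getId()); // "main"
  }
}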
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the id attribute.
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the id attribute.
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.LocalMktDate
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCDateOnly
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
- getIdentifier() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
-
The unique identifier for this type.
- getIdleShutdownTimeout() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- getImpersonateServiceAccount() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
All API requests will be made as the given service account or target service account in an impersonation delegation chain instead of the currently selected account.
- getImplementor() - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
- getImplementor() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.CastFunctionImpl
- getImplementor(boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
- getInboundObserver() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- getIncompatibleGlobalWindowErrorMessage() - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
-
Returns the error message for unsupported default values in Combine.globally().
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
- getIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPFieldRef
- getIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
- getIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexInputRef
- getIndex() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getIndex() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getIndex() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getIndex() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
The zero-based index of this trigger firing that produced this pane.
- getIndex(TupleTag<?>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
Returns the index for the given tuple tag if the tag is present in this schema, or -1 if it isn't.
- getIndexes() - Method in class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexFieldAccess
- getIndexOffset() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShard
-
Return the absolute position within the Ism file where the index block begins.
- getIndexPosition() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- getInferMaps() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
-
Controls whether to use the map or row FieldType for a TableSchema field that appears to represent a map (it is an array of structs containing only key and value fields).
- getInflightWaitSeconds() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
-
If the previous call to appendRows blocked due to flow control, returns how long the call blocked for.
- getIngestManager() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceConfig
-
Getter for the ingest manager, which serves the API to load data in streaming mode and to retrieve a report about loaded data.
- getInitialBackoff() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The initial backoff duration to be used before retrying a request for the first time.
- getInitializedProducer(SolaceIO.SubmissionMode) - Method in class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- getInitializedProducer(SolaceIO.SubmissionMode) - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
Returns a MessageProducer object for publishing messages to Solace.
- getInitialRestriction(InputT) - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
- getInitialRestriction(Map<String, String>) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- getInitialRestriction(PulsarSourceDescriptor) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
- getInitialWatermarkEstimatorState(InitialPipelineState) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
- getInitialWatermarkEstimatorState(PartitionRecord) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
- getInitialWatermarkEstimatorState(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
- getInitialWatermarkEstimatorState(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
- getInitialWatermarkEstimatorState(Instant) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- getInitialWatermarkEstimatorState(Instant) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
- getInitialWatermarkEstimatorState(Instant) - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
- getInput() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.UnresolvedTranslation
- getInput() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getInput(PTransform<InputT, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
- getInput(PTransform<T, ?>) - Method in class org.apache.beam.runners.flink.translation.utils.LookupPipelineVisitor
- getInput(PTransform<T, ?>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getInput(PTransform<T, ?>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getInput(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- getInputDataSet(PValue) - Method in class org.apache.beam.runners.twister2.Twister2BatchTranslationContext
- getInputDataSet(PValue) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getInputDoc() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
- getInputFile() - Method in interface org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.WordCountOptions
- getinputFormatClass() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getinputFormatKeyClass() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getInputFormatProvider() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getinputFormatValueClass() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getInputId(PipelineNode.PTransformNode) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
- getInputReceivers() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
-
Get a map of PCollection ids to receivers which consume input elements, forwarding them to the remote environment.
- getInputReceivers() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
-
Get a map of PCollection ids to receivers which consume input elements, forwarding them to the remote environment.
- getInputs() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getInputs() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getInputs(PTransform<?, ?>) - Method in class org.apache.beam.runners.flink.translation.utils.LookupPipelineVisitor
- getInputs(PTransform<?, ?>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getInputs(PTransform<InputT, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Returns the input of the transform currently being translated.
- getInputs(ResolvedNodes.ResolvedQueryStmt) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter
-
Extract ZetaSQL resolved nodes that correspond to the inputs of the current node.
- getInputSchema() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getInputSchemas() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getInputSplitAssigner(SourceInputSplit[]) - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat
- getInputSplitAssigner(GenericInputSplit[]) - Method in class org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
-
Returns the TypeVariable of InputT.
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
-
Returns the TypeVariable of InputT.
- getInputType() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
- getInputType() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getInputType() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns a TypeDescriptor capturing what is known statically about the input type of this CombineFn instance's most-derived class.
- getInputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFn
-
Returns a TypeDescriptor capturing what is known statically about the input type of this DoFn instance's most-derived class.
- getInputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.InferableFunction
-
Returns a TypeDescriptor capturing what is known statically about the input type of this InferableFunction instance's most-derived class.
- getInputValueCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
-
Returns the Coder of the values of the input to this transform.
- getInputValueCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns the Coder of the values of the input to this transform.
- getInsertBundleParallelism() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getInsertCount() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getInsertDefault() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
- getInsertErrors() - Method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy.Context
- getInspectConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getInspectConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- getInspectConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getInspectTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getInspectTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- getInspectTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getInstance() - Static method in class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageContextFactory
- getInstance() - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
- getInstance() - Static method in class org.apache.beam.runners.spark.translation.SparkExecutableStageContextFactory
- getInstance() - Static method in class org.apache.beam.sdk.io.googleads.DefaultGoogleAdsClientFactory
- getInstance() - Static method in class org.apache.beam.sdk.metrics.NoOpCounter
- getInstance() - Static method in class org.apache.beam.sdk.metrics.NoOpHistogram
- getInstance(String, String) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler
- getInstance(SparkSession) - Static method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
-
Get the MetricsAccumulator on this driver.
- getInstanceAdminClient() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
- getInstanceConfigId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
-
Returns the instance id being written to.
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- getInstructionId() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
- getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
- getInstructionRequestHandler() - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
- getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
- getInstructionRequestHandler() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
-
Return an InstructionRequestHandler which can communicate with the environment.
- getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.environment.RemoteEnvironment.SimpleRemoteEnvironment
- getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironment
- getInt(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getInt(Map<String, Object>, String, Integer) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getInt16() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getInt16(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT16 value by field index; a ClassCastException is thrown if the schema doesn't match.
- getInt16(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT16 value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getInt32() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getInt32(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT32 value by field index; a ClassCastException is thrown if the schema doesn't match.
- getInt32(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT32 value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getInt64() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getInt64(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT64 value by field index; a ClassCastException is thrown if the schema doesn't match.
- getInt64(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT64 value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getInterface() - Method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
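A minimal sketch of the typed getters above (the schema and values are made up for illustration):
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.values.Row;

public class RowAccessExample {
  public static void main(String[] args) {
    Schema schema = Schema.builder().addInt32Field("userId").addInt64Field("visits").build();
    Row row = Row.withSchema(schema).addValues(42, 100L).build();
    int userId = row.getInt32("userId"); // access by field name
    long visits = row.getInt64(1);       // access by field index
    System.out.println(userId + " / " + visits);
  }
}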
- getInterfaces() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a set of TypeDescriptors, one for each interface implemented by this class.
- getIntersectingPartition(Range.ByteStringRange, Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Return the overlapping parts of 2 partitions.
- getIo() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- getIr() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
- getIrOptions() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
- getIsLocalChannelProvider() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getIsWindmillServiceDirectPathEnabled() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getIterable(int) - Method in class org.apache.beam.sdk.values.Row
-
Get an iterable value by field index; an IllegalStateException is thrown if the schema doesn't match.
- getIterable(String) - Method in class org.apache.beam.sdk.values.Row
-
Get an iterable value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getIterableComponentType(TypeDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
-
For an array T[] or a subclass of Iterable<T>, return a TypeDescriptor describing T.
- getJarPath() - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
Optional Beam filesystem path to the jar containing the bytecode for this function.
- getJavaClass(RelDataType) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamJavaTypeFactory
- getJavaClassLookupAllowlist() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- getJavaClassLookupAllowlistFile() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- getJAXBClass() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- getJdbcType() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getJdbcType() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getJdbcUrl() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getJdbcUrl() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getJdkAddOpenModules() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Open modules needed for reflection that access JDK internals with Java 9+.
- getJdkAddOpenModules() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Open modules needed for reflection that access JDK internals with Java 9+.
- getJdkAddRootModules() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Add modules to the default root set with Java 11+.
- getJetDefaultParallelism() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- getJetLocalMode() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- getJetProcessorsCooperative() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- getJetServers() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- getJfrRecordingDurationSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- getJmsCorrelationID() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsDeliveryMode() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsDestination() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsExpiration() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsMessageID() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsPriority() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsRedelivered() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsReplyTo() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsTimestamp() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJmsType() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getJob() - Method in exception class org.apache.beam.runners.dataflow.DataflowJobException
-
Returns the failed job.
- getJob() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
- getJob(JobReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Gets the specified Job by the given JobReference.
- getJob(JobReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- getJob(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
Gets the Dataflow Job with the given jobId.
- getJobCheckIntervalInSecs() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getJobEndpoint() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getJobFileZip() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getJobId() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Get the id of this job.
- getJobId() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
-
The identity of the Dataflow job.
- getJobId() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
- getJobInfo() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
- getJobInfo() - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator.TranslationContext
- getJobInfo() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
- getJobLabelsMap() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getJobMessages(String, long) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
-
Return job messages sorted in ascending order by timestamp.
- getJobMetrics(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
Gets the JobMetrics with the given jobId.
- getJobMetrics(JobApi.GetJobMetricsRequest, StreamObserver<JobApi.GetJobMetricsResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- getJobMetrics(JobApi.GetJobMetricsRequest, StreamObserver<JobApi.GetJobMetricsResponse>) - Method in class org.apache.beam.runners.portability.testing.TestJobService
- getJobMonitoringPageURL(String, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
-
Deprecated. This method defaults the region to "us-central1". Prefer using the overload with an explicit regionId parameter.
- getJobMonitoringPageURL(String, String, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
- getJobName() - Method in interface org.apache.beam.sdk.options.PipelineOptions
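For orientation, a sketch of reading jobName back off PipelineOptions (the flag value is illustrative):
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class JobNameExample {
  public static void main(String[] args) {
    // Parse a command-line flag into options, then read the property getter.
    PipelineOptions opts = PipelineOptionsFactory.fromArgs("--jobName=my-job").create();
    System.out.println(opts.getJobName()); // "my-job"
  }
}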
- getJobs(JobApi.GetJobsRequest, StreamObserver<JobApi.GetJobsResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- getJobServerConfig() - Method in interface org.apache.beam.runners.portability.testing.TestPortablePipelineOptions
- getJobServerDriver() - Method in interface org.apache.beam.runners.portability.testing.TestPortablePipelineOptions
- getJobServerTimeout() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getJobServerUrl() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
- getJobService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
-
Returns a real, mock, or fake BigQueryServices.JobService.
- getJobService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
- getJobService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- getJobType() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getJoinColumns(boolean, List<Pair<RexNode, RexNode>>, int, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms
- getJsonBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
-
Returns a SimpleFunction mapping JSON byte[] arrays to Beam Rows.
- getJsonClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
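A minimal sketch of using the returned function (the schema and JSON payload are hypothetical):
import java.nio.charset.StandardCharsets;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.schemas.utils.JsonUtils;
import org.apache.beam.sdk.transforms.SimpleFunction;
import org.apache.beam.sdk.values.Row;

public class JsonToRowExample {
  public static void main(String[] args) {
    // The schema must describe the expected shape of the JSON payload.
    Schema schema = Schema.builder().addStringField("name").addInt32Field("age").build();
    SimpleFunction<byte[], Row> toRow = JsonUtils.getJsonBytesToRowFunction(schema);
    Row row = toRow.apply("{\"name\":\"ada\",\"age\":36}".getBytes(StandardCharsets.UTF_8));
    System.out.println(row.getString("name")); // "ada"
  }
}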
- getJsonFactory() - Static method in class org.apache.beam.sdk.extensions.gcp.util.Transport
- getJsonFactory() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- getJsonStringToRowFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
- getJsonTimePartitioning() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getJsonToRowWithErrFn() - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError
- getKeep() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getKeep() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getKeep() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
- getKeepFields() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getKey() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkBroadcastStateInternals
- getKey() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals
- getKey() - Method in interface org.apache.beam.runners.local.Bundle
-
Returns the key that was output in the most recent GroupByKey in the execution of this bundle.
- getKey() - Method in class org.apache.beam.runners.local.StructuralKey
-
Returns the key that this StructuralKey was created from.
- getKey() - Method in class org.apache.beam.runners.spark.stateful.SparkStateInternals
- getKey() - Method in class org.apache.beam.runners.spark.translation.SparkProcessContext
- getKey() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
- getKey() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- getKey() - Method in class org.apache.beam.sdk.metrics.MetricResult
- getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
- getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
The key for the display item.
- getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The key for the display item.
- getKey() - Method in class org.apache.beam.sdk.values.KV
-
Returns the key of this KV.
- getKey() - Method in class org.apache.beam.sdk.values.ShardedKey
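The KV accessors in one line each (the key and value are illustrative):
import org.apache.beam.sdk.values.KV;

public class KvExample {
  public static void main(String[] args) {
    KV<String, Integer> pair = KV.of("clicks", 7);
    System.out.println(pair.getKey());   // "clicks"
    System.out.println(pair.getValue()); // 7
  }
}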
- getKey(Coder<K>) - Method in class org.apache.beam.runners.flink.adapter.FlinkKey
- getKey(WindowedValue<KeyedWorkItem<K, V>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.WorkItemKeySelector
- getKey(WindowedValue<KV<K, InputT>>) - Method in class org.apache.beam.runners.flink.translation.types.KvKeySelector
- getKey(WindowedValue<KV<K, InputT>>) - Method in class org.apache.beam.runners.flink.translation.types.WindowedKvKeySelector
- getKey(WindowedValue<KV<K, V>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.KvToFlinkKeyKeySelector
- getKey(WindowedValue<KV<KV<K, V>, Double>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SdfFlinkKeyKeySelector
- getKeyClass() - Method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
- getKeyCoder() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.MetadataKeyCoder
- getKeyCoder() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- getKeyCoder() - Method in class org.apache.beam.sdk.coders.KvCoder
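A short sketch of how KvCoder exposes its component coders (the coder choices are illustrative):
import org.apache.beam.sdk.coders.KvCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.coders.VarIntCoder;

public class KvCoderExample {
  public static void main(String[] args) {
    // A composite coder for KV<String, Integer> built from two component coders.
    KvCoder<String, Integer> coder = KvCoder.of(StringUtf8Coder.of(), VarIntCoder.of());
    System.out.println(coder.getKeyCoder());   // StringUtf8Coder
    System.out.println(coder.getValueCoder()); // VarIntCoder
  }
}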
- getKeyCoder() - Method in class org.apache.beam.sdk.coders.MapCoder
- getKeyCoder() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- getKeyCoder() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getKeyCoder() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getKeyCoder() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
- getKeyCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns the Coder of the keys of the input to this transform, which is also used as the Coder of the keys of the output of this transform.
- getKeyCoder(Pipeline, Coder<KV<KeyT, KV<Long, EventT>>>) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Provide the key coder.
- getKeyComponent(int) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecord
-
Returns the key component at the specified index.
- getKeyComponentCoder(int) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
-
Returns the key coder at the specified index.
- getKeyComponentCoders() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
-
Returns the list of key component coders.
- getKeyComponents() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecord
-
Returns the list of key components.
- getKeyDeserializerProvider() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getKeyedCollections() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Returns a list of TaggedKeyedPCollections for the PCollections contained in this KeyedPCollectionTuple.
- getKeyedResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
-
Gets resources with input SearchParameter key.
- getKeyParts(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
- getKeyRange() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
-
Returns the range of keys that will be read from the table.
- getKeys() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- getKeySerializer() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getKeySet() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getKeysJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
-
The primary keys of this specific modification.
- getKeystorePassword() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getKeystorePath() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getKeyTranslationFunction() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getKeyTypeDescriptor() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getKind() - Method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
- getKind() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- getKind() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
-
Return the display name for this factory.
- getKind() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
- getKind() - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
- getKindString() - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
- getKindString() - Method in class org.apache.beam.sdk.io.Read.Bounded
- getKindString() - Method in class org.apache.beam.sdk.io.Read.Unbounded
- getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
- getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
- getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
- getKindString() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
- getKindString() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
- getKindString() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Returns the name to use by default for this PTransform (not including the names of any enclosing PTransforms).
- getKindString() - Method in class org.apache.beam.sdk.transforms.Tee
- getKindString() - Method in class org.apache.beam.sdk.transforms.windowing.Window
- getKindString() - Method in class org.apache.beam.sdk.values.PValueBase
-
Returns a String capturing the kind of this PValueBase.
- getKinesisIOConsumerArns() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions
-
Used to enable/disable EFO.
- getKmsKey() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- getKmsKey() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getKV() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getLabel() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
Retrieve the optional label for an item.
- getLabel() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The optional label for an item.
- getLabels() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Labels that will be applied to the billing records for this job.
- getLabels() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets labels.
- getLabels() - Method in class org.apache.beam.sdk.metrics.MetricName
-
Associated labels for the metric.
- getLanguage() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
- getLanguage() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
- getLanguageOptions() - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
- getLastContiguousSequenceRange() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getLastEmitted() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
-
Returns the last value emitted by the reader.
- getLastFieldId() - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- getLastProcessedSequence() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getLastRunTimestamp() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- getLastUpdated() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- getLastWatermarkedBatchTime() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
- getLatencyNanos() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
-
The publishing latency in nanoseconds.
- getLatencyTrackingInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getLatestBufferedSequence() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getLatestHL7v2SendTime(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Gets the latest HL7v2 send time.
- getLatestHL7v2SendTime(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- getLateTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
- getLeaves() - Method in class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
- getLeaves() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getLegacyDevContainerVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the version/tag for the legacy SDK FnAPI container image.
- getLegacyEnvironmentMajorVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the legacy environment's major version number.
- getLength() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- getLength() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- getLength() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- getLength() - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
- getLength() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
- getLength() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
- getLevel() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
- getLimitCountOfSortRel() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
- getLinkUrl() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
Retrieve the optional link URL for an item.
- getLinkUrl() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The optional link URL for an item.
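For illustration, a minimal sketch of how a link URL is attached to a display item inside a transform's populateDisplayData override (the key and URL values here are illustrative, not from the source):

    import org.apache.beam.sdk.transforms.display.DisplayData;

    @Override
    public void populateDisplayData(DisplayData.Builder builder) {
      super.populateDisplayData(builder);
      // withLinkUrl sets the optional link URL that getLinkUrl() later retrieves.
      builder.add(
          DisplayData.item("documentation", "transform docs")
              .withLinkUrl("https://example.com/docs"));
    }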
- getList() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
- getListeners() - Method in interface org.apache.beam.runners.spark.SparkContextOptions
- getListOfMaps(Map<String, Object>, String, List<Map<String, Object>>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getLiteralGqlQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getLoadBalanceBundles() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getLocalhost() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getLocalJobServicePortFile() - Method in interface org.apache.beam.runners.portability.testing.TestUniversalRunner.Options
-
A file containing the job service port, since Gradle needs to know this filename statically to provide it in Beam testing options.
- getLocalValue() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
- getLocalWindmillHostport() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getLocation() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
- getLocation() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getLocation() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getLocation() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getLocation() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getLockToAcquireForStateAccessDuringBundles() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
-
Subclasses may provide a lock to ensure that the state backend is not accessed concurrently during bundle execution.
- getLockToAcquireForStateAccessDuringBundles() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- getLogicalStartTime() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getLogicalType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- getLogicalType(Class<LogicalTypeT>) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Helper function for retrieving the concrete logical type subclass.
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.DateConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeMicrosConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampConversion
- getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampMicrosConversion
- getLogicalTypeValue(int, Class<T>) - Method in class org.apache.beam.sdk.values.Row
-
Returns the Logical Type input type for this field.
- getLogicalTypeValue(String, Class<T>) - Method in class org.apache.beam.sdk.values.Row
-
Returns the Logical Type input type for this field.
- getLoginTimeout() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getLoginTimeout() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getLogLevel() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
- getLogMdc() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Whether to include SLF4J MDC in log entries.
- getLogTopicVerification() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getLong(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getLong(Map<String, Object>, String, Long) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getLowWatermark() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
- getLowWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- getMainOutputTag() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
- getMainOutputTag() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- getMainOutputTag() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
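For context, ParDo.MultiOutput is the transform returned by withOutputTags; a minimal sketch of a two-output ParDo, assuming words is a PCollection<String> (the tag names and routing logic are illustrative):

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.TupleTag;
    import org.apache.beam.sdk.values.TupleTagList;

    final TupleTag<String> longWords = new TupleTag<String>() {};
    final TupleTag<String> shortWords = new TupleTag<String>() {};

    PCollectionTuple results =
        words.apply(
            ParDo.of(
                    new DoFn<String, String>() {
                      @ProcessElement
                      public void process(@Element String word, MultiOutputReceiver out) {
                        // Route each element to the main or the additional output.
                        out.get(word.length() > 3 ? longWords : shortWords).output(word);
                      }
                    })
                .withOutputTags(longWords, TupleTagList.of(shortWords)));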
- getMainTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
-
The main trigger, which will continue firing until the "until" trigger fires.
- getManifestListLocation() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getMap() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
- getMap(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a MAP value by field index; an IllegalStateException is thrown if the schema doesn't match.
- getMap(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a MAP value by field name; an IllegalStateException is thrown if the schema doesn't match.
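A minimal sketch of both lookups on a Row with a MAP field (the schema and values are illustrative):

    import java.util.Collections;
    import java.util.Map;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    Schema schema =
        Schema.builder()
            .addMapField("scores", Schema.FieldType.STRING, Schema.FieldType.INT64)
            .build();

    Row row =
        Row.withSchema(schema)
            .withFieldValue("scores", Collections.singletonMap("alice", 10L))
            .build();

    // Lookup by field name or by field index; requesting a non-MAP
    // field this way raises an IllegalStateException instead.
    Map<String, Long> byName = row.getMap("scores");
    Map<String, Long> byIndex = row.getMap(0);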
- getMapKeyType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
If the field is a map type, returns the key type.
- getMapKeyType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- getMapping() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
- getMapping() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
- getMapType(TypeDescriptor, int) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- getMapValueType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
If the field is a map type, returns the value type.
- getMapValueType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- getMatcher() - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.MatcherAndError
- getMatchUpdatedFiles() - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
- getMaterialization() - Method in class org.apache.beam.sdk.transforms.ViewFn
-
Gets the materialization of this ViewFn.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapFromVoidKeyViewFn
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapViewFn
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapFromVoidKeyViewFn
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapViewFn
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn2
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn2
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn2
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
- getMax() - Method in class org.apache.beam.sdk.metrics.DistributionResult
- getMaxAttempts() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The maximum number of times a request will be attempted for a complete successful result.
- getMaxBufferingDuration() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
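The buffering duration is configured on the transform itself; a minimal sketch, assuming keyedInput is a PCollection<KV<String, Long>>:

    import org.joda.time.Duration;
    import org.apache.beam.sdk.transforms.GroupIntoBatches;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Emit batches of up to 100 values per key, flushing a partial
    // batch once it has been buffered for 10 seconds.
    PCollection<KV<String, Iterable<Long>>> batched =
        keyedInput.apply(
            GroupIntoBatches.<String, Long>ofSize(100)
                .withMaxBufferingDuration(Duration.standardSeconds(10)));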
- getMaxBufferingDurationMilliSec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getMaxBundlesFromWindmillOutstanding() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
-
Maximum number of bundles outstanding from windmill before the worker stops requesting.
- getMaxBundleSize() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getMaxBundleTimeMills() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getMaxBytesFromWindmillOutstanding() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
-
Maximum number of bytes outstanding from windmill before the worker stops requesting.
- getMaxCacheMemoryUsage(PipelineOptions) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMb
- getMaxCacheMemoryUsage(PipelineOptions) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions.MaxCacheMemoryUsageMb
- getMaxCacheMemoryUsageMb() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Size (in MB) for the process wide cache within the SDK harness.
- getMaxCacheMemoryUsageMbClass() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
An instance of this class will be used to specify the maximum amount of memory to allocate to a cache within an SDK harness instance.
- getMaxCacheMemoryUsagePercent() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Size (in % [0 - 100]) for the process wide cache within the SDK harness.
- getMaxCommitDelay() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getMaxConnectionPoolConnections() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getMaxCumulativeBackoff() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getMaxElementCountToTriggerContinuousSequenceRangeReevaluation() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler.OrderedProcessingGlobalSequenceHandler
-
Number of new elements to trigger the re-evaluation.
- getMaxEndOffset(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
- getMaxEndOffset(PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns the actual ending offset of the current source.
- getMaxInvocationHistory() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- getMaxLength() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
- getMaxLength() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
- getMaxNumericPrecision() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
- getMaxNumericScale() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
- getMaxNumRecords() - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration
- getMaxNumRecords() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getMaxNumWorkers() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
The maximum number of workers to use for the workerpool.
- getMaxNumWritersPerBundle() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getMaxOutputElementsPerBundle() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Returns the maximum number of elements that will be output per bundle.
- getMaxParallelism() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getMaxPrecision(SqlTypeName) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
- getMaxPreviewRecords() - Method in class org.apache.beam.sdk.io.cdap.context.BatchSourceContextImpl
- getMaxReadTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getMaxReadTimeSeconds() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getMaxReadTimeSecs() - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration
- getMaxRecordsPerBatch() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- getMaxStackTraceDepthToReport() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getMaxStreamingBatchSize() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getMaxStreamingRowsToBatch() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getMD5() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
- getMean() - Method in class org.apache.beam.sdk.metrics.DistributionResult
- getMean() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
- getMemoryMB() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
-
Returns the configured size of the memory buffer.
- getMemoryMB() - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
-
Returns the configured size of the memory buffer.
- getMessage() - Method in class org.apache.beam.io.requestresponse.ApiIOError
-
The Exception message.
- getMessage() - Method in exception class org.apache.beam.sdk.coders.Coder.NonDeterministicException
- getMessage() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
-
The caught Throwable.getMessage().
- getMessage() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
-
Underlying Message.
- getMessage() - Method in exception class org.apache.beam.sdk.transforms.windowing.IncompatibleWindowException
- getMessageBacklog() - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
-
Current backlog in messages (latest offset of the partition - last processed record offset).
- getMessageConverter(DestinationT, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- getMessageId() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
SQS message id.
- getMessageId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the messageId of the message populated by Cloud Pub/Sub.
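The messageId is only populated when it is requested at read time; a minimal sketch, assuming pipeline is an existing Pipeline (the topic name is illustrative):

    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;
    import org.apache.beam.sdk.values.PCollection;

    // Request message ids at read time; getMessageId() on each
    // PubsubMessage then returns the id Cloud Pub/Sub assigned.
    PCollection<PubsubMessage> messages =
        pipeline.apply(
            PubsubIO.readMessagesWithAttributesAndMessageId()
                .fromTopic("projects/my-project/topics/my-topic"));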
- getMessageId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getMessageId() - Method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey
- getMessageId() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
-
The message id of the message that was published.
- getMessageId() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the unique identifier of the message, a string for an application-specific message identifier.
- getMessageName() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getMessageName() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getMessageName() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getMessageName() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getMessageRecord() - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessage
- getMessages() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.Result
- getMessages() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- getMessageStream(JobApi.JobMessagesRequest, StreamObserver<JobApi.JobMessagesResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- getMessageType() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
Returns the Protocol Buffers Message type this ProtoCoder supports.
- getMessageType() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets message type.
- getMetadata() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecord
-
Returns the metadata.
- getMetadata() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
-
Returns the MatchResult.Metadata of the file.
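A minimal sketch of reaching the metadata of matched files, assuming pipeline is an existing Pipeline (the file pattern is illustrative):

    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;

    pipeline
        .apply(FileIO.match().filepattern("gs://my-bucket/logs/*.txt"))
        .apply(FileIO.readMatches())
        .apply(
            ParDo.of(
                new DoFn<FileIO.ReadableFile, String>() {
                  @ProcessElement
                  public void process(
                      @Element FileIO.ReadableFile file, OutputReceiver<String> out) {
                    // The metadata carries the resource id and size of the matched file.
                    out.output(file.getMetadata().resourceId().toString());
                  }
                }));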
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
-
String representing the metadata of the Bundle to be written.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
-
String representing the metadata of the messageId to be read.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
-
Gets metadata.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Returns the gathered metadata for the change stream query so far.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The connector execution metadata for this record.
- getMetadata() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Returns the extracted metadata.
- getMetadata() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
- getMetadata(DestinationT) - Method in class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
-
Return AVRO file metadata for a given destination.
- getMetadata(MetadataEntity) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getMetadata(MetadataScope, MetadataEntity) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getMetadata(String) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Deprecated. Use schema options instead.
- getMetaData() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getMetadataCoder() - Method in class org.apache.beam.sdk.io.ReadableFileCoder
- getMetadataKey() - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat
-
An object representing a wildcard for a key component.
- getMetadataQuery() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- getMetadataString(String) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Deprecated. Use schema options instead.
- getMetadataTable() - Method in interface org.apache.beam.sdk.io.gcp.spanner.SpannerIO.SpannerChangeStreamOptions
-
Returns the name of the metadata table.
- getMetadataTableAdminDao() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- getMetadataTableDao() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- getMetadataTableDebugString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- getMetaStore() - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
- getMeters(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
- getMeters(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
- getMethod() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- getMethod() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- getMethod() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- getMethods(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
-
Returns the list of non private/protected, non-static methods in the class, caching the results.
- getMethodsMap(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- getMetricGaugeName(String, int) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
-
Creates a MetricName based on topic name and partition id.
- getMetricGroup() - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
- getMetricGroup() - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainerWithoutAccumulator
- getMetricLabels() - Method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.ParsedMetricName
- getMetrics() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
- getMetrics() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getMetricsContainer(String) - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
- getMetricsContainer(String) - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainerWithoutAccumulator
- getMetricsEnvironmentStateForCurrentThread() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
-
Returns the container holder for the current thread.
- getMetricsGraphiteHost() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- getMetricsGraphitePort() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- getMetricsHttpSinkUrl() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- getMetricsMapName(long) - Static method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- getMetricsPushPeriod() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- getMetricsSink() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- getMimeType() - Method in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- getMimeType() - Method in interface org.apache.beam.sdk.io.FileBasedSink.OutputFileHints
-
Returns the MIME type that should be used for the files that will hold the output data.
- getMin() - Method in class org.apache.beam.sdk.metrics.DistributionResult
- getMinBundleSize() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns the minimum bundle size that should be used when splitting the source into sub-sources.
- getMinConnectionPoolConnections() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getMinCpuPlatform() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Specifies a Minimum CPU platform for VM instances.
- getMinimumTimestamp() - Method in interface org.apache.beam.runners.local.Bundle
-
Return the minimum timestamp among elements in this bundle.
- getMinPauseBetweenCheckpoints() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getMinReadTimeMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- getMissingPartitionsFrom(List<Range.ByteStringRange>, ByteString, ByteString) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Return missing partitions within partitions that are within start and end.
- getMissingPartitionsFromEntireKeySpace(List<Range.ByteStringRange>) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Return missing partitions from the entire keyspace.
- getMode() - Method in class org.apache.beam.sdk.io.FileBasedSource
- getMode() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getModeNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- getModifiableCollection() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
- getMods() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The modifications within this record.
- getModType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The type of operation that caused the modifications within this record.
- getMonitoringInfos() - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the cumulative values for any metrics in this container as MonitoringInfos.
- getMonthOfYear() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- getMutableOutput(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- getMutationInformation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation
- GetMutationsFromBeamRow() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.GetMutationsFromBeamRow
- getMutationType() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
- getNaiveObjectSerializer() - Static method in class org.apache.beam.runners.flink.translation.utils.SerdeUtils
- getName() - Method in enum class org.apache.beam.io.debezium.Connectors
-
The name of this connector class.
- getName() - Method in class org.apache.beam.runners.jet.metrics.BoundedTrieImpl
- getName() - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
- getName() - Method in class org.apache.beam.runners.jet.metrics.DistributionImpl
- getName() - Method in class org.apache.beam.runners.jet.metrics.GaugeImpl
- getName() - Method in class org.apache.beam.runners.jet.metrics.StringSetImpl
- getName() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
- getName() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
- getName() - Method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
- getName() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
- getName() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets name.
- getName() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- getName() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- getName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
The name of the column.
- getName() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.Column
- getName() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
- getName() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Destination
-
Gets the name of the destination.
- getName() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Queue
- getName() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Topic
- getName() - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
- getName() - Method in class org.apache.beam.sdk.metrics.DelegatingDistribution
- getName() - Method in class org.apache.beam.sdk.metrics.DelegatingGauge
- getName() - Method in class org.apache.beam.sdk.metrics.DelegatingHistogram
- getName() - Method in interface org.apache.beam.sdk.metrics.Metric
-
The MetricName given to this metric.
- getName() - Method in class org.apache.beam.sdk.metrics.MetricName
-
The name of this metric.
- getName() - Method in class org.apache.beam.sdk.metrics.MetricNameFilter
-
If set, the metric must have this name to match this MetricNameFilter.
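A minimal sketch of querying metrics by name after a run, assuming pipeline is an existing Pipeline (the namespace and metric name are illustrative):

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    PipelineResult result = pipeline.run();
    result.waitUntilFinish();

    // Restrict the query to metrics named "parsedRecords" in namespace "myApp".
    MetricQueryResults metrics =
        result.metrics()
            .queryMetrics(
                MetricsFilter.builder()
                    .addNameFilter(MetricNameFilter.named("myApp", "parsedRecords"))
                    .build());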
- getName() - Method in class org.apache.beam.sdk.metrics.MetricResult
-
Return the name of the metric.
- getName() - Method in class org.apache.beam.sdk.metrics.NoOpCounter
- getName() - Method in class org.apache.beam.sdk.metrics.NoOpHistogram
- getName() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
Returns the field name.
- getName() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
- getName() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
- getName() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
- getName() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
- getName() - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns the field name.
- getName() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
- getName() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Returns the transform name.
- getName() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the name of this PCollection.
- getName() - Method in interface org.apache.beam.sdk.values.PValue
-
Returns the name of this PValue.
- getName() - Method in class org.apache.beam.sdk.values.PValueBase
-
Returns the name of this PValueBase.
- getName(int) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- getNameCount() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- getNameOverride() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- getNameOverride() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
- getNameOverride(String, T) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- getNamespace() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getNamespace() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getNamespace() - Method in class org.apache.beam.sdk.metrics.MetricName
-
The namespace associated with this metric.
- getNamespace() - Method in class org.apache.beam.sdk.metrics.MetricNameFilter
-
The namespace that a metric must be in to match this MetricNameFilter.
- getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
- getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
The namespace for the display item.
- getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The namespace for the display item.
- getNeedsAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
- getNeedsMessageId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
- getNeedsOrderingKey() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
- getNestedFieldsAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
- getNetwork() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
GCE network for launching workers.
- getNetworkTimeout() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getNewBigqueryClient(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- getNewValuesJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
-
The new column values after the modification was applied.
- getNextId() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.UniqueIdGenerator
-
Return a random base64-encoded 8-byte string.
- getNextOffset() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- getNextProcessingTimer() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Finds the latest timer in the TimeDomain.PROCESSING_TIME domain that has expired based on the current processing time.
- getNextWindow() - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
- getNodeStats() - Method in interface org.apache.beam.sdk.extensions.sql.impl.planner.NodeStatsMetadata
- getNodeStats(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelMetadataQuery
- getNodeStats(RelNode, BeamRelMetadataQuery) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- getNodeStats(RelNode, RelMetadataQuery) - Method in interface org.apache.beam.sdk.extensions.sql.impl.planner.NodeStatsMetadata.Handler
- getNodeStats(RelNode, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats
- getNonCumulativeCost(RelNode, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl
- getNonNullPrefix() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
- getNonSpeculativeIndex() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
The zero-based index of this trigger firing among non-speculative panes.
- getNonWildcardPrefix(String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns the prefix portion of the glob that doesn't contain wildcards.
- getNormalizeKeyLen() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- getNoSpilling() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getNotSupported() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTableFilter
-
Identify parts of a predicate that are not supported by the IO push-down capabilities to be preserved in a Calc following BeamIOSourceRel.
- getNotSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.DefaultTableFilter
-
Since predicate push-down is assumed not to be supported by default, returns an unchanged list of filters to be preserved.
- getNotSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
- getNotSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergFilter
- getNotSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableFilter
- getNullable() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- getNullableValue() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Workaround for autovalue code generation, which does not allow type variables to be instantiated with nullable actual parameters.
- getNullFirst() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
- getNullParams() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamSqlUnparseContext
- getNum() - Method in class org.apache.beam.runners.spark.io.ConsoleIO.Write.Unbound
- getNumber() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
Optionally returns the field index.
- getNumber() - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
- getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
- getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- getNumberOfBufferedEvents() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getNumberOfExecutionRetries() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getNumberOfKeys() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- getNumberOfPartitionsInTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The total number of partitions for the given transaction.
- getNumberOfReceivedEvents() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getNumberOfRecordsInTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The total number of data change records for the given transaction.
- getNumberOfRecordsRead() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the total number of records read from the change stream so far.
- getNumberOfRecordsRead() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The number of records read in the partition change stream query before reading this record.
- getNumberOfShardKeyCoders(List<?>) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- getNumberOfWorkerHarnessThreads() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Number of threads to use on the Dataflow worker harness.
- getNumberOverride(int, T) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- getNumBytes() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteSuccessSummary
- getNumBytes() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- getNumConcurrentCheckpoints() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getNumConsumers(PValue) - Method in class org.apache.beam.runners.flink.translation.utils.CountingPipelineVisitor
-
Calculate the number of consumers of a given PValue.
- getNumEntities(PipelineOptions, String, String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns the number of entities available for reading.
- getNumExtractJobCalls() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- getNumPartitions() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getNumQuerySplits() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getNumRows(BigQueryOptions, TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
-
Returns the number of rows for a given table.
- getNumSampledBytesPerFile() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
- getNumShards() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- getNumShards() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getNumShards() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getNumShardsProvider() - Method in class org.apache.beam.sdk.io.WriteFiles
- getNumSplits() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- getNumStorageWriteApiStreamAppendClients() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getNumStorageWriteApiStreams() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getNumStreamingKeys() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getNumStreams() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getNumWorkers() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Number of workers to use when executing the Dataflow job.
- getNumWrites() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteSuccessSummary
- getNumWrites() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- getOauthToken() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getOauthToken() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getOAuthToken() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getObject() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Returns the object name associated with this GCS path, or an empty string if no object is specified.
- getObject(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getObject(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getObject(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns the StorageObject for the given GcsPath.
- getObjectMapper() - Static method in class org.apache.beam.sdk.extensions.sql.TableUtils
- getObjectReuse() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getObjects(List<GcsPath>) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns StorageObjectOrIOExceptions for the given GcsPaths.
- getObservedTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getObservedTimestamp() - Method in class org.apache.beam.io.requestresponse.ApiIOError
-
The observed timestamp of the error.
- getObservedTimestamp() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
-
The date and time when the Exception occurred.
- getOffset() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
- getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
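A minimal sketch of a window offset in practice, assuming input is a PCollection<String>: hourly windows shifted to start at 15 minutes past each hour.

    import org.joda.time.Duration;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;

    // Without an offset, hourly windows align to the epoch; withOffset
    // shifts every window boundary by the given duration.
    PCollection<String> windowed =
        input.apply(
            Window.into(
                FixedWindows.of(Duration.standardHours(1))
                    .withOffset(Duration.standardMinutes(15))));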
- getOffsetConsumerConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getOffsetDeduplication() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getOffsetLimit() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
- getOffsetLimit() - Method in interface org.apache.beam.sdk.io.UnboundedSource.CheckpointMark
- getOldValuesJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
-
The old column values before the modification was applied.
- getOnCreateMatcher() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
- getOneOfSchema() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Returns the schema of the underlying Row that is used to represent the union.
- getOneOfTypes() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- getOneRecord(Map<String, String>) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- getOnly() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getOnly() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getOnly(String) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Like CoGbkResult.getOnly(TupleTag) but using a String instead of a TupleTag.
- getOnly(String, V) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Like CoGbkResult.getOnly(TupleTag, Object) but using a String instead of a TupleTag.
- getOnly(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
If there is a singleton value for the given tag, returns it.
- getOnly(TupleTag<V>, V) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
If there is a singleton value for the given tag, returns it.
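A minimal sketch of getOnly after a CoGroupByKey, assuming names and cities are PCollection<KV<Long, String>> inputs (all names here are illustrative):

    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TupleTag;

    final TupleTag<String> nameTag = new TupleTag<String>() {};
    final TupleTag<String> cityTag = new TupleTag<String>() {};

    PCollection<KV<Long, CoGbkResult>> joined =
        KeyedPCollectionTuple.of(nameTag, names)
            .and(cityTag, cities)
            .apply(CoGroupByKey.create());

    // Per key, downstream code calls getOnly to fetch the singleton value
    // for a tag; the two-argument form supplies a default when the tag
    // has no value:
    //   String name = result.getOnly(nameTag);
    //   String city = result.getOnly(cityTag, "unknown");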
- getOnSuccessMatcher() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
- getOnTimeBehavior() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getOperand0() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCheckConstraint
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlColumnDeclaration
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateCatalog
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateDatabase
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateExternalTable
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropCatalog
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropDatabase
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropTable
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlUseCatalog
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlUseDatabase
- getOperands() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPCall
- getOperation() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
- getOperation() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getOperationMode() - Method in class org.apache.beam.runners.twister2.BeamBatchTSetEnvironment
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPCall
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCheckConstraint
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlColumnDeclaration
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlUseCatalog
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlUseDatabase
- getOperatorChaining() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getOptionNames() - Method in class org.apache.beam.sdk.schemas.Schema.Options
- getOptions() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
- getOptions() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getOptions() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getOptions() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getOptions() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getOptions() - Method in class org.apache.beam.sdk.Pipeline
- getOptions() - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns the field's Schema.Options.
- getOptions() - Method in class org.apache.beam.sdk.schemas.Schema
- getOptions() - Method in class org.apache.beam.sdk.testing.TestPipeline
- getOptionsId() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
Provides a process-wide unique ID for this PipelineOptions object, assigned at graph construction time.
- getOptionsSupplier() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
- getOptionsSupplier() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getOrCreate(BigtableConfig) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
-
Create a BigtableChangeStreamAccessor if one doesn't exist and store it in the cache for faster access.
- getOrCreate(SpannerConfig) - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- getOrCreateReader(PipelineOptions, CheckpointMarkT) - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
- getOrCreateSession(SparkStructuredStreamingPipelineOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.SparkSessionFactory
-
Gets active SparkSession or creates one using SparkStructuredStreamingPipelineOptions.
- getOrDecode(Coder<T>) - Method in class org.apache.beam.runners.spark.translation.ValueAndCoderLazySerializable
- getOrDefault(K, V) - Method in interface org.apache.beam.sdk.state.MapState
-
A deferred lookup.
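A minimal sketch of the deferred lookup in a stateful DoFn (the state id and counting logic are illustrative):

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.coders.VarLongCoder;
    import org.apache.beam.sdk.state.MapState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    // Counts occurrences of each attribute value within a key.
    class CountAttributes extends DoFn<KV<String, String>, Long> {
      @StateId("counts")
      private final StateSpec<MapState<String, Long>> countsSpec =
          StateSpecs.map(StringUtf8Coder.of(), VarLongCoder.of());

      @ProcessElement
      public void process(
          @Element KV<String, String> element,
          @StateId("counts") MapState<String, Long> counts,
          OutputReceiver<Long> out) {
        // getOrDefault is deferred: nothing is fetched until read() is called.
        long current = counts.getOrDefault(element.getValue(), 0L).read();
        counts.put(element.getValue(), current + 1);
        out.output(current + 1);
      }
    }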
- getOrderingKey() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the ordering key of the message.
- getOrdinalPosition() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
The position of the column in the table.
- getOrphanedNewPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
-
Returns a list of NewPartition that have been around for a while and do not overlap with any missing partition.
- getOrThrowException() - Method in class org.apache.beam.sdk.io.cdap.context.FailureCollectorWrapper
- getOutboundObserver() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
- getOutName(int) - Method in class org.apache.beam.sdk.values.TupleTag
-
If this TupleTag is tagging output outputIndex of a PTransform, returns the name that should be used by default for the output.
- getOutput() - Method in interface org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.WordCountOptions
- getOutput() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getOutput() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
-
The CsvIOParseResult PCollection as a result of successfully parsing CSV records.
- getOutput() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling
- getOutput() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling
- getOutput() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling
- getOutput() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- getOutput() - Method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- getOutput() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.DefaultErrorHandler
- getOutput() - Method in interface org.apache.beam.sdk.transforms.errorhandling.ErrorHandler
- getOutput() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler
- getOutput(PTransform<?, OutputT>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
- getOutput(PTransform<?, T>) - Method in class org.apache.beam.runners.flink.translation.utils.LookupPipelineVisitor
- getOutput(PTransform<?, T>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getOutput(PTransform<?, T>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getOutput(TupleTag<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getOutputCoder() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.impulse.BeamImpulseSource
- getOutputCoder() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
- getOutputCoder() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.BoundedCosmosBDSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.CompressedSource
-
Returns the delegate source's output coder.
- getOutputCoder() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
- getOutputCoder() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.Source
-
Returns the Coder to use for the data read from this source.
- getOutputCoder() - Method in class org.apache.beam.sdk.io.TextSource
- getOutputCoder() - Method in class org.apache.beam.sdk.io.xml.XmlSource
- getOutputCoder(SerializableFunction<InputT, OutputT>, Coder<InputT>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
Deprecated. This method is to change in an unknown backwards-incompatible way once support for this functionality is refined.
- getOutputCoders() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getOutputCoders() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getOutputCoders(PTransform<?, ?>) - Method in class org.apache.beam.runners.flink.translation.utils.LookupPipelineVisitor
- getOutputExecutablePath() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getOutputFile() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
- getOutputFilePrefix() - Method in interface org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample.DCExamplePipelineOptions
-
Output file prefix.
- getOutputFormatProvider() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getOutputId(PipelineNode.PTransformNode) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
- getOutputKvCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
-
Returns the Coder of the output of this transform.
- getOutputKvCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns the Coder of the output of this transform.
- getOutputManager() - Method in interface org.apache.beam.runners.spark.translation.SparkInputDataProcessor
- getOutputOrNull(ErrorHandling) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- getOutputParallelization() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getOutputParallelization() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
- getOutputPortSchemas() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getOutputPrefix() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getOutputs() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getOutputs() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getOutputs(PTransform<?, ?>) - Method in class org.apache.beam.runners.flink.translation.utils.LookupPipelineVisitor
- getOutputs(PTransform<?, ?>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getOutputs(PTransform<?, OutputT>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Returns the output of the transform currently being translated.
- getOutputSchema() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getOutputSchema(Schema, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
-
Get the output schema resulting from selecting the given FieldAccessDescriptor from the given schema.
- getOutputStrategyInternal(WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Get the output strategy of this Window PTransform.
- getOutputStream() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
-
Returns the
TypeVariable
of OutputT.
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
-
Returns the
TypeVariable
of OutputT.
- getOutputType() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
- getOutputType() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- getOutputType() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf
- getOutputType() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns a
TypeDescriptor
capturing what is known statically about the output type of this CombineFn
instance's most-derived class.
- getOutputTypeDescriptor() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
- getOutputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFn
-
Returns a
TypeDescriptor
capturing what is known statically about the output type of this DoFn
instance's most-derived class.
- getOutputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.InferableFunction
-
Returns a
TypeDescriptor
capturing what is known statically about the output type of this InferableFunction
instance's most-derived class.
- getOutputWatermarkHold() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
- getOverlappingPartitions(List<Range.ByteStringRange>) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Return a list of overlapping partitions.
- getOverloadRatio() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The target ratio between requests sent and successful requests.
- getOverrideWindmillBinary() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
-
Custom windmill_main binary to use with the streaming runner.
- getPaneInfo() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
- getPaneInfo() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the pane of this
FailsafeValueInSingleWindow
in its window.
- getPaneInfo() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Returns the pane of this
ValueInSingleWindow
in its window.
- getPaneInfo() - Method in interface org.apache.beam.sdk.values.WindowedValue
-
The
PaneInfo
associated with this WindowedValue.
- getPaneInfo() - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- getParallelism() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getParallelism() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getParameters() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
- getParameters() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase
-
Returns the parameters of this function.
- getParameters() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.CastFunctionImpl
- getParamWindowedValueCoder(Coder<T>) - Static method in class org.apache.beam.sdk.values.WindowedValues
-
Returns the
ParamWindowedValueCoder
from the given valueCoder.
- getParent() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Returns the parent path, or
null
if this path does not have a parent.
- getParentId() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
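For the GcsPath.getParent entry above, a minimal sketch; the bucket and object names are placeholders:

    import org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath;

    public class GcsPathParentExample {
      public static void main(String[] args) {
        // Parse a GCS URI and walk one level up the object path.
        GcsPath file = GcsPath.fromUri("gs://example-bucket/logs/2024/events.json");
        GcsPath dir = file.getParent(); // gs://example-bucket/logs/2024/
        System.out.println(dir);
      }
    }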
- getParentLowWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- getParentPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- getParentPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- getParents() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- getParentTokens() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
The unique partition identifiers of the parent partitions where this child partition originated from.
- getParentTokens() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The unique partition identifiers of the parent partitions where this child partition originated from.
- getParquetConfiguration() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- getParseFn() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getParser() - Method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
-
Get the memoized
Parser
, possibly initializing it lazily.
- getParser() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
Get the memoized
Parser
, possibly initializing it lazily.
- getPartition() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- getPartition() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- getPartition() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
- getPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- getPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getPartition(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Fetches the partition metadata row data for the given partition token.
- getPartition(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Fetches the partition metadata row data for the given partition token.
- getPartitionColumn() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getPartitionCreatedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which this partition was first detected and created in the metadata table.
- getPartitionEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The end time for the partition change stream query, which produced this record.
- getPartitionFields() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
- getPartitionFields() - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
- getPartitionFields() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getPartitionKey() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getPartitionKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner.ExplicitPartitioner
- getPartitionKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
-
Determines which shard in the stream the record is assigned to.
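For the KinesisPartitioner.getPartitionKey entry above, a minimal sketch of a custom partitioner; the OrderEvent type is hypothetical:

    import org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner;

    public class CustomerPartitionerExample {
      // Hypothetical event type, used only for illustration.
      static class OrderEvent {
        final String customerId;
        OrderEvent(String customerId) { this.customerId = customerId; }
      }

      // Routes all events of one customer to the same shard by reusing the
      // customer id as the Kinesis partition key.
      static class CustomerPartitioner implements KinesisPartitioner<OrderEvent> {
        @Override
        public String getPartitionKey(OrderEvent record) {
          return record.customerId;
        }
      }
    }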
- getPartitionMetadataAdminDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Creates and returns a singleton DAO instance for admin operations over the partition metadata table.
- getPartitionMetadataDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Creates and returns a singleton DAO instance for accessing the partition metadata table.
- getPartitionQueryTimeout() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getPartitionReadTimeout() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getPartitionRunningAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the connector started processing this partition.
- getPartitions() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Bounded
- getPartitions() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
- getPartitions() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
- getPartitionScheduledAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which this partition was scheduled to be queried.
- getPartitionSpec() - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
-
Partition spec destination, in the event that it must be dynamically created.
- getPartitionStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The start time for the partition change stream query, which produced this record.
- getPartitionsToReconcile(Instant, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
-
For missing partitions, try to organize the mismatched parent tokens in a way that fills the missing partitions.
- getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The partition token that produced this change stream record.
- getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The unique identifier of the partition that generated this record.
- getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
Unique partition identifier, which can be used to perform a change stream query.
- getPartitionTokens() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionStartRecord
-
List of partitions yielded within this record.
- getPassword() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- getPassword() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getPassword() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getPassword() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getPassword() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getPassword() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getPassword() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getPath() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Table path up to the leaf table name.
- getPath() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SchemaPath
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- getPath() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration
- getPath() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
- getPath() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
- getPath() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
The path for the display item within a component hierarchy.
- getPathPrefix() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getPathValidator() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
The path validator instance that should be used to validate paths.
- getPathValidatorClass() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
The class of the validator that should be created and used to validate paths.
- getPatientCompartments() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
-
Gets the patient compartment responses for GetPatientEverything requests.
- getPatientEverything() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Get the patient compartment for a FHIR Patient using the GetPatientEverything/$everything API.
- getPatientEverything(String, Map<String, Object>, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
FHIR GetPatientEverything HTTP body.
- getPatientEverything(String, Map<String, Object>, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- getPatternCondition() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
- getPatternVar() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
- getPayload() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- getPayload() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the main PubSub message.
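For the PubsubMessage.getPayload entry above, a minimal sketch of decoding the payload bytes in a DoFn, assuming UTF-8 text payloads:

    import java.nio.charset.StandardCharsets;
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;
    import org.apache.beam.sdk.transforms.DoFn;

    public class ExtractPayloadFn extends DoFn<PubsubMessage, String> {
      @ProcessElement
      public void processElement(@Element PubsubMessage message, OutputReceiver<String> out) {
        // getPayload() returns the raw message bytes; decode them as UTF-8 text.
        out.output(new String(message.getPayload(), StandardCharsets.UTF_8));
      }
    }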
- getPayload() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getPayload() - Method in class org.apache.beam.sdk.io.mqtt.MqttRecord
- getPayload() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the payload of the message as a byte array.
- getPayload() - Method in class org.apache.beam.sdk.schemas.io.Failure
-
Bytes containing the payload which has failed.
- getPayload() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UnknownLogicalType
- getPayload(AvroGenericCoder) - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderTranslator
- getPayload(WindowedValues.ParamWindowedValueCoder<?>) - Static method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
-
Returns the serialized payload that will be provided when deserializing this coder.
- getPCollection() - Method in interface org.apache.beam.runners.local.Bundle
-
Returns the PCollection that the elements of this bundle belong to.
- getPCollection() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
For internal use only.
- getPCollection() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
- getPCollectionConsumptionMap() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Get the map of
PCollection
to the number of PTransform
consuming it.
- getPCollectionInputs() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
- getPCollectionInputs() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
- getPerDestinationOutputFilenames() - Method in class org.apache.beam.sdk.io.WriteFilesResult
-
Returns a
PCollection
of all output filenames generated by this WriteFiles
organized by user destination type.
- getPerElementConsumers(CollectionT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
- getPerElementInputs(ExecutableT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
- getPeriod() - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
-
Amount of time between generated windows.
- getPeriod() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- getPeriod() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
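For the SlidingWindows.getPeriod entry above, a minimal sketch; the window sizes are illustrative:

    import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
    import org.joda.time.Duration;

    public class SlidingWindowPeriodExample {
      public static void main(String[] args) {
        // One-hour windows that slide every ten minutes.
        SlidingWindows windows =
            SlidingWindows.of(Duration.standardHours(1)).every(Duration.standardMinutes(10));
        // getPeriod() reports the ten-minute slide between window starts.
        Duration period = windows.getPeriod();
        System.out.println(period);
      }
    }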
- getPeriodicStatusPageOutputDirectory() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getPerWorkerCounter(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the
Counter
that should be used for implementing the given per-worker metricName in this container.
- getPgJsonb(int) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Returns the record at the current pointer as
JsonB.
- getPipeline() - Method in class org.apache.beam.io.requestresponse.Result
- getPipeline() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Retrieve the job's pipeline.
- getPipeline() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getPipeline() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
- getPipeline() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- getPipeline() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteResult
- getPipeline() - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
- getPipeline() - Method in class org.apache.beam.sdk.io.solace.write.SolaceOutput
- getPipeline() - Method in class org.apache.beam.sdk.io.WriteFilesResult
- getPipeline() - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
- getPipeline() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
- getPipeline() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
- getPipeline() - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
- getPipeline() - Method in class org.apache.beam.sdk.values.PBegin
- getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionList
- getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
- getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionTuple
- getPipeline() - Method in class org.apache.beam.sdk.values.PDone
- getPipeline() - Method in interface org.apache.beam.sdk.values.PInput
- getPipeline() - Method in interface org.apache.beam.sdk.values.POutput
- getPipeline() - Method in class org.apache.beam.sdk.values.PValueBase
- getPipeline(JobApi.GetJobPipelineRequest, StreamObserver<JobApi.GetJobPipelineResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- getPipelineFromClasspath(String) - Static method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
- getPipelineName() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getPipelineOptions() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Options
- getPipelineOptions() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Returns the configured pipeline options.
- getPipelineOptions() - Method in class org.apache.beam.runners.direct.DirectRegistrar.Options
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
- getPipelineOptions() - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator.TranslationContext
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkRunner
-
For testing.
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Options
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.TestFlinkRunner
- getPipelineOptions() - Method in class org.apache.beam.runners.jet.JetRunnerRegistrar.Options
- getPipelineOptions() - Method in class org.apache.beam.runners.portability.testing.TestPortablePipelineOptions.TestPortablePipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.runners.portability.testing.TestUniversalRunner.OptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.runners.prism.PrismRegistrar.Options
- getPipelineOptions() - Method in class org.apache.beam.runners.spark.SparkRunnerRegistrar.Options
- getPipelineOptions() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunnerRegistrar.Options
- getPipelineOptions() - Method in class org.apache.beam.runners.twister2.Twister2RunnerRegistrar.Options
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpPipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransformOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcConnection
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- getPipelineOptions() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
-
Perform a DFS (depth-first search) to find the PipelineOptions config.
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions.KinesisIOOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.aws2.options.AwsPipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.azure.options.AzurePipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.options.DefaultPipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.options.ManualDockerEnvironmentOptions.Options
- getPipelineOptions() - Method in interface org.apache.beam.sdk.options.PipelineOptionsRegistrar
- getPipelineOptions() - Method in class org.apache.beam.sdk.options.RemoteEnvironmentOptions.Options
- getPipelineOptions() - Method in interface org.apache.beam.sdk.state.StateContext
-
Returns the
PipelineOptions
specified with the PipelineRunner.
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.Context
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.StartBundleContext
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use
TestPipeline
with the DirectRunner.
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions.Options
- getPipelineOptionsFromClasspath(String) - Static method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
- getPipelinePolicy() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getPipelineProto() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Get the Runner API pipeline proto if available.
- getPipelineProto() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
- getPipelineRunners() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Runner
- getPipelineRunners() - Method in class org.apache.beam.runners.direct.DirectRegistrar.Runner
- getPipelineRunners() - Method in class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Runner
- getPipelineRunners() - Method in class org.apache.beam.runners.jet.JetRunnerRegistrar.Runner
- getPipelineRunners() - Method in class org.apache.beam.runners.portability.PortableRunnerRegistrar
- getPipelineRunners() - Method in class org.apache.beam.runners.portability.testing.TestUniversalRunner.RunnerRegistrar
- getPipelineRunners() - Method in class org.apache.beam.runners.prism.PrismRegistrar.Runner
- getPipelineRunners() - Method in class org.apache.beam.runners.spark.SparkRunnerRegistrar.Runner
- getPipelineRunners() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunnerRegistrar.Runner
- getPipelineRunners() - Method in class org.apache.beam.runners.twister2.Twister2RunnerRegistrar.Runner
- getPipelineUrl() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
The URL of the staged portable pipeline.
- getPlanner() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- getPlannerName() - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
- getPluginClass() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets the main class of a plugin.
- getPluginConfig() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets a plugin config.
- getPluginProperties() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getPluginProperties(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getPluginType() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets a plugin type.
- getPollInterval() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getPollIntervalMillis() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
-
The time, in milliseconds, to wait before polling for new files.
- getPort() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- getPort() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- getPort() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServer
-
Get the port that this
ExpansionServer
is bound to.
- getPort() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
- getPort() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
- getPortNumber() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getPortNumber() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getPositionForFractionConsumed(double) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
Returns a position
P
such that the range [start, P)
represents approximately the given fraction of the range [start, end).
- getPrecision() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
- getPrecision() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
- getPrecision() - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
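For the OffsetRangeTracker.getPositionForFractionConsumed entry above, a minimal sketch of the underlying arithmetic, roughly start + fraction * (end - start); the range and fraction are illustrative:

    import org.apache.beam.sdk.io.range.OffsetRangeTracker;

    public class FractionPositionExample {
      public static void main(String[] args) {
        // For the range [0, 100), the position for fraction 0.25 is about
        // 0 + 0.25 * (100 - 0) = 25.
        OffsetRangeTracker tracker = new OffsetRangeTracker(0L, 100L);
        long position = tracker.getPositionForFractionConsumed(0.25);
        System.out.println(position); // 25
      }
    }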
- getPredefinedCsvFormat() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.CsvConfiguration
-
See
CSVFormat.Predefined.values()
for a list of allowed values.
- getPreferGroupByKeyToHandleHugeValues() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- getPrefix() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
First element in the path.
- getPrefix() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
- getPrefixedEndpoint(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getPreviousWindow() - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
- getPrimary() - Method in class org.apache.beam.sdk.transforms.splittabledofn.SplitResult
-
Returns the primary restriction.
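For the SplitResult.getPrimary entry above, a minimal sketch of splitting an offset restriction; the split point is illustrative:

    import org.apache.beam.sdk.io.range.OffsetRange;
    import org.apache.beam.sdk.transforms.splittabledofn.SplitResult;

    public class SplitResultExample {
      public static void main(String[] args) {
        // Split [0, 100) so the primary keeps [0, 40) and the residual gets [40, 100).
        SplitResult<OffsetRange> split =
            SplitResult.of(new OffsetRange(0, 40), new OffsetRange(40, 100));
        OffsetRange primary = split.getPrimary();
        OffsetRange residual = split.getResidual();
        System.out.println(primary + " / " + residual);
      }
    }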
- getPrimaryKey() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getPriority() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getPriority() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the priority level of the message (0-255, higher is more important).
- getPrismLocation() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- getPrismLogLevel() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- getPrismVersionOverride() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- getPrivateKey() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getPrivateKeyPassphrase() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getPrivateKeyPassphrase() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getPrivateKeyPassphrase() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getPrivateKeyPath() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getPrivateKeyPath() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getProcessBundleDescriptor() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
- getProcessBundleDescriptor() - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
- getProcessBundleDescriptor(String) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
- getProcessBundleDescriptor(BeamFnApi.GetProcessBundleDescriptorRequest, StreamObserver<BeamFnApi.ProcessBundleDescriptor>) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
- getProcessingTimeAdvance() - Method in class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
- getProcessor(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
-
Provides
SdkHarnessClient.BundleProcessor
that is capable of processing bundles not containing timers or state accesses such as: side inputs, user state, remote references.
- getProcessor(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>, StateDelegator) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
-
Provides
SdkHarnessClient.BundleProcessor
that is capable of processing bundles not containing timers.
- getProcessor(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>, StateDelegator, Map<String, Map<String, ProcessBundleDescriptors.TimerSpec>>) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
-
Provides
SdkHarnessClient.BundleProcessor
that is capable of processing bundles containing timers and state accesses such as: side inputs, user state, remote references.
- getProcessWideContainer() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
-
Return the
MetricsContainer
for the current process.
- getProduced(ExecutableT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
- getProducedType() - Method in class org.apache.beam.runners.flink.translation.types.KvKeySelector
- getProducedType() - Method in class org.apache.beam.runners.flink.translation.types.WindowedKvKeySelector
- getProducedType() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.KvToFlinkKeyKeySelector
- getProducedType() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SdfFlinkKeyKeySelector
- getProducedType() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.WorkItemKeySelector
- getProducer(CollectionT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
- getProducer(PValue) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Get the
AppliedPTransform
that produced the provided PValue.
- getProducerConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getProducerConfigUpdates() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getProducerFactoryFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getProducersMapCardinality() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getProfilingAgentConfiguration() - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
- getProgress() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
-
Returns the progress made within the restriction so far.
- getProgress() - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles.BlockTracker
- getProgress() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
- getProgress() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
- getProgress() - Method in class org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker
- getProgress() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
- getProgress() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.HasProgress
-
A representation for the amount of known completed and known remaining work.
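For the RestrictionTracker.HasProgress entry above, a minimal sketch of building a progress report from absolute amounts of completed and remaining work; the numbers are illustrative:

    import org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker;

    public class ProgressExample {
      public static void main(String[] args) {
        // Report 750 units of completed work and 250 units remaining.
        RestrictionTracker.Progress progress = RestrictionTracker.Progress.from(750.0, 250.0);
        System.out.println(progress);
      }
    }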
- getProject() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- getProject() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
Project id to use when launching jobs.
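For the GcpOptions.getProject entry above, a minimal sketch; "example-project" is a placeholder, not a real project id:

    import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class ProjectOptionExample {
      public static void main(String[] args) {
        GcpOptions options = PipelineOptionsFactory.create().as(GcpOptions.class);
        options.setProject("example-project"); // placeholder project id
        // getProject() returns the project jobs will be launched under.
        System.out.println(options.getProject());
      }
    }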
- getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the project path.
- getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getProjectedSchema() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
-
Returns the projected Schema after applying column pruning.
- getProjectId() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Get the project this job exists in.
- getProjectId() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
- getProjectId() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- getProjectId() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- getProjectId() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
-
Returns the project id being written to.
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SchemaPath
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- getProperties() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
- getProperties() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getProperties() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
- getProperties() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- getProtoBytesToRowFn(Class<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
- getProtoBytesToRowFromSchemaFunction(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
- getProtoBytesToRowFunction(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
- getProtoChangeStreamRecord() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Returns the only change stream record proto at the current pointer of the result set.
- getProvidedSparkContext() - Method in interface org.apache.beam.runners.spark.SparkContextOptions
- getProvider(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
- getProviderRuntimeValues() - Method in interface org.apache.beam.sdk.testing.TestPipeline.TestValueProviderOptions
- getProvisionInfo(ProvisionApi.GetProvisionInfoRequest, StreamObserver<ProvisionApi.GetProvisionInfoResponse>) - Method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
- getProxyConfiguration() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
ProxyConfiguration
used to configure AWS service clients.
- getPTransformForInput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
- getPTransformForInput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- getPTransformForInput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.PayloadSerializerKafkaTable
- getPTransformForOutput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
- getPTransformForOutput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- getPTransformForOutput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.PayloadSerializerKafkaTable
- getPTransformId() - Method in class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
- getPublishBatchWithOrderingKey() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
- getPublished() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
-
Whether the message was published or not.
- getPublishedResultsQueue() - Method in class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- getPublishedResultsQueue() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
Returns the
Queue<Solace.PublishResult>
instance associated with this session, with the asynchronously received callbacks from Solace for message publications.
- getPublishLatencyMetric() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getPublishMonotonicNanos() - Method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey
- getPublishTimestamp() - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessage
- getPublishTimestampFunction() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getPubsubRootUrl() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
-
Root URL for use with the Google Cloud Pub/Sub API.
- getPViews() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Return the current views created in the pipeline.
- getQualifiers() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
- getQuantifier() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
- getQueries() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
-
Configures the BigQuery read job with the SQL query.
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- getQuery() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
- getQuery() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getQuery() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets a query which can be a source for reading.
- getQuery() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getQueryLocation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
-
BigQuery geographic location where the query job will be executed.
- getQueryName() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getQueryStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the timestamp at which the change stream query for a
ChangeStreamResultSet
first started.
- getQueryStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time that the change stream query which produced this record started.
- getQueryString() - Method in interface org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample.DCExamplePipelineOptions
-
SQL Query.
- getQueue() - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
-
Getter for the queue.
- getQueue() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getQueueUrl() - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration
- getQuotationMark() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the character that will surround
String
in staged CSV files.
- getRamMegaBytes() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getRange() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
-
Returns the current range.
- getRate() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- getRate() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
- getRate() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
- getRateLimitPolicy() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
- getRateLimitPolicy() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsIO.RateLimitPolicyFactory
- getRaw(ObjectT) - Method in interface org.apache.beam.sdk.schemas.FieldValueGetter
-
Returns the raw value of the getter before any further transformations.
- getRawBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- getRawBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- getRawPrivateKey() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getRawPrivateKey() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getRawPrivateKey() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getRawStringToRowFunction(Schema) - Static method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- getRawType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
Returns the raw class type.
- getRawType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
- getRDD() - Method in class org.apache.beam.runners.spark.translation.BoundedDataset
- getRead() - Method in class org.apache.beam.io.requestresponse.Cache.Pair
- getReadCounterPrefix() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
- getReaderCacheTimeoutSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The amount of time before UnboundedReaders are considered idle and closed during streaming execution.
- getReaderCheckpoint(int, FlinkSourceReaderBase.ReaderAndOutput) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
- getReaderCheckpoint(int, FlinkSourceReaderBase.ReaderAndOutput) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSourceReader
- getReaderCheckpoint(int, FlinkSourceReaderBase.ReaderAndOutput) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
-
Create
FlinkSourceSplit
for given splitId.
- getReadOperation() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- getReadQuery() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getReadResponse() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
-
Gets resources.
- getReadTime() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getReadTime() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getReadTimePercentage() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- getReason() - Method in exception class org.apache.beam.sdk.coders.CannotProvideCoderException
- getReason() - Method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
- getReasons() - Method in exception class org.apache.beam.sdk.coders.Coder.NonDeterministicException
- getReceiptHandle() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
SQS receipt handle.
- getReceiver() - Method in class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
- getReceiver() - Method in class org.apache.beam.sdk.fn.data.DataEndpoint
- getReceiver() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
- getReceiver() - Method in class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- getReceiver() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
Returns a MessageReceiver object for receiving messages from Solace.
- getReceiverBuilder() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets a
ReceiverBuilder.
- getReceiverClass() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets Spark
Receiver
class for a CDAP plugin.
- getReceiveTimestamp() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the timestamp (in milliseconds since the Unix epoch) when the message was received by the Solace broker.
- getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
- getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
- getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
- getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
- getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
- getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
- getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
- getRecord() - Method in class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
- getRecord() - Method in class org.apache.beam.sdk.io.kudu.TableAndRecord
- getRecord() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
-
Information about the record that failed.
- getRecordJfrOnGcThrashing() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
If true, save a JFR profile when GC thrashing is first detected.
- getRecordReadAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the timestamp at which a record was read from the
ChangeStreamResultSet.
- getRecordReadAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the record was fully read.
- getRecordSchema() - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
- getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
Indicates the order in which a record was put to the stream.
- getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
Indicates the order in which this record was put into the change stream in the scope of a partition, commit timestamp and transaction tuple.
- getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEndRecord
-
Indicates the order in which a record was put to the stream.
- getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEventRecord
-
Indicates the order in which a record was put to the stream.
- getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionStartRecord
-
Indicates the order in which a record was put to the stream.
- getRecordStreamEndedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the timestamp at which a record finished being streamed.
- getRecordStreamEndedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the record finished streaming.
- getRecordStreamStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the timestamp at which a record first started to be streamed.
- getRecordStreamStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the record started to be streamed.
- getRecordTimestamp() - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecord
-
The timestamp associated with the record.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The Cloud Spanner timestamp time when this record occurred.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
Returns the timestamp at which this partition started being valid in Cloud Spanner.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The timestamp at which the modifications within were committed in Cloud Spanner.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
-
Indicates the timestamp for which the change stream query has returned all changes.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEndRecord
-
Indicates the timestamp for which the change stream partition is terminated.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEventRecord
-
Returns the timestamp at which the key range change occurred.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionStartRecord
-
Returns the timestamp at which these partitions started being valid in Cloud Spanner.
- getRecordType() - Method in class org.apache.beam.sdk.coders.SerializableCoder
- getRedelivered() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Indicates whether the message has been redelivered due to a prior delivery failure.
- getRedistributeNumKeys() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getReferentialConstraints() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- getRegexFromPattern(RexNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
Recursively construct a regular expression from a RexNode.
- getRegion() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Get the region this job exists in.
- getRegion() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
The Google Compute Engine region for creating Dataflow jobs.
- getRegionFromEnvironment() - Static method in class org.apache.beam.runners.dataflow.options.DefaultGcpRegionFactory
- getRegisteredOptions() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
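The entries above cover PipelineOptionsFactory; a minimal sketch of how a custom options interface is registered, parsed, and validated (the MyOptions interface and its inputPath flag are hypothetical, for illustration only):

import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class OptionsExample {
  /** Hypothetical options interface, for illustration only. */
  public interface MyOptions extends PipelineOptions {
    @Description("Path to the input data")
    @Default.String("/tmp/input")
    String getInputPath();

    void setInputPath(String value);
  }

  public static void main(String[] args) {
    // Registering makes the interface visible to --help and validation,
    // and it then appears in PipelineOptionsFactory.getRegisteredOptions().
    PipelineOptionsFactory.register(MyOptions.class);
    MyOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);
    System.out.println(options.getInputPath());
  }
}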
- getReidentifyConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getReidentifyTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- getReIterableGroupByKeyResult() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getRelList() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- getRelTypes() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcSplittingRule
- getRelTypes() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamJavaUdfCalcRule
- getRelTypes() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRule
- getRelTypes() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcSplittingRule
- getRemoteHeapDumpLocation() - Method in interface org.apache.beam.sdk.options.MemoryMonitorOptions
-
A remote file system to upload captured heap dumps to.
- getRemoteInputDestinations() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
Get RemoteInputDestinations that input data are sent to the BeamFnApi.ProcessBundleDescriptor over.
- getRemoteOutputCoders() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
Get all of the transforms materialized by this ProcessBundleDescriptors.ExecutableProcessBundleDescriptor and the Java Coder for the wire format of that transform.
- getRepeatedTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
- getReplacedByJob() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Returns a new DataflowPipelineJob for the job that replaced this one, if applicable.
- getReplacedByJob() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
- getReplacementTransform(AppliedPTransform<PCollection<? extends InputT>, PCollection<OutputT>, ParDo.SingleOutput<InputT, OutputT>>) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory
- getReplacementTransform(AppliedPTransform<PCollection<? extends InputT>, PCollectionTuple, PTransform<PCollection<? extends InputT>, PCollectionTuple>>) - Method in class org.apache.beam.runners.direct.ParDoMultiOverrideFactory
- getReplacementTransform(AppliedPTransform<PCollection<ElemT>, PCollection<ElemT>, PTransform<PCollection<ElemT>, PCollection<ElemT>>>) - Method in class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView.Factory
- getReplicationGroupMessageId() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the ID for the message within its replication group (if applicable).
- getReplyTo() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getReplyTo() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the destination to which replies to this message should be sent.
- getReportCheckpointDuration() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getRequestAsString() - Method in class org.apache.beam.io.requestresponse.ApiIOError
-
The string representation of the request associated with the error.
- getRequestTimeStamp() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
Timestamp the message was received at (in epoch millis).
- getRequiredSchema() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
-
Returns a Schema that includes all the fields required for a successful read.
- getRequirements() - Method in class org.apache.beam.sdk.transforms.Contextful
-
Returns the requirements needed to run the closure.
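A minimal sketch of building a Contextful closure whose Requirements declare a side input, which is what getRequirements() later reports; the pipeline and values are illustrative, and running it assumes the direct runner is on the classpath:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Contextful;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.transforms.Requirements;
import org.apache.beam.sdk.transforms.View;
import org.apache.beam.sdk.values.PCollectionView;
import org.apache.beam.sdk.values.TypeDescriptors;

public class ContextfulExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    PCollectionView<String> prefix =
        p.apply("Prefix", Create.of("beam-")).apply(View.asSingleton());
    p.apply(Create.of("a", "b"))
        .apply(
            MapElements.into(TypeDescriptors.strings())
                .via(
                    // The Requirements declare the side input; the runner uses
                    // getRequirements() to learn what the closure needs.
                    Contextful.fn(
                        (String x, Contextful.Fn.Context c) -> c.sideInput(prefix) + x,
                        Requirements.requiresSideInputs(prefix))));
    p.run().waitUntilFinish();
  }
}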
- getResidual() - Method in class org.apache.beam.sdk.transforms.splittabledofn.SplitResult
-
Returns the residual restriction.
- getResourceHints() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Returns resource hints set on the transform.
- getResourceHints() - Method in interface org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions
- getResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
-
Gets resources.
- getResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
-
Gets resources.
- getResourceType() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- getResponse() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
-
HTTP response from the FHIR store after attempting to write the Bundle.
- getResponseItemJson() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
- getResponses() - Method in class org.apache.beam.io.requestresponse.Result
- getRestrictionCoder() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- getRestrictionCoder() - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
- getRestrictionCoder() - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
- getResult() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
-
Returns the result of the transaction execution.
- getResult() - Method in class org.apache.beam.sdk.metrics.BoundedTrieResult
- getResultCoder(Pipeline) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Provide the result coder.
- getResultCount() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getResults() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
-
Returns a PCollection containing the Rows that have been parsed.
- getRetainDockerContainers() - Method in interface org.apache.beam.sdk.options.ManualDockerEnvironmentOptions
- getRetainExternalizedCheckpointsOnCancellation() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getRetryableCodes() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getReturnType(RelDataTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
- getReturnType(RelDataTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
- getReturnType(RelDataTypeFactory, SqlOperatorBinding) - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
- getRole() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getRole() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getRole() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getRoot() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- getRootCause() - Method in exception class org.apache.beam.sdk.coders.CannotProvideCoderException
-
Returns the inner-most CannotProvideCoderException when they are deeply nested.
- getRootCheckpointDir() - Method in class org.apache.beam.runners.spark.translation.streaming.Checkpoint.CheckpointDir
- getRootElement() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration
- getRootSchema() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getRootTransforms() - Method in interface org.apache.beam.runners.direct.ExecutableGraph
- getRoutingKey() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- getRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
- getRow(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a Row value by field index; an IllegalStateException is thrown if the schema doesn't match.
- getRow(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.ROW value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getRowCount() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
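A minimal sketch of the Row.getRow accessors listed above, using hypothetical person/address schemas:

import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.values.Row;

public class RowExample {
  public static void main(String[] args) {
    // Hypothetical schemas, for illustration only.
    Schema address = Schema.builder().addStringField("city").build();
    Schema person =
        Schema.builder().addStringField("name").addRowField("address", address).build();

    Row row =
        Row.withSchema(person)
            .addValues("Ada", Row.withSchema(address).addValue("London").build())
            .build();

    // getRow(String) fetches a nested ROW field by name; an
    // IllegalStateException is raised if the field is not a ROW.
    Row addr = row.getRow("address");
    System.out.println(addr.getString("city"));
  }
}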
- getRowCount() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
- getRowGroupSize() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration
- getRowReceiver(TupleTag<T>) - Method in interface org.apache.beam.sdk.transforms.DoFn.MultiOutputReceiver
-
Returns a DoFn.OutputReceiver for publishing Row objects to the given tag.
- getRowRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- getRows() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- getRows() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.TableWithRows
- getRowSchema() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- getRowSelector(Schema, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
- getRowSelectorOptimized(Schema, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
- getRowsWritten() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
-
The number of rows written in this batch.
- getRowToAvroBytesFunction(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Returns a function mapping Beam Rows to encoded AVRO GenericRecords.
- getRowToBytesFn(String) - Static method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider
- getRowToGenericRecordFunction(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Returns a function mapping Beam Rows to AVRO GenericRecords for use in PCollection.setSchema(org.apache.beam.sdk.schemas.Schema, org.apache.beam.sdk.values.TypeDescriptor<T>, org.apache.beam.sdk.transforms.SerializableFunction<T, org.apache.beam.sdk.values.Row>, org.apache.beam.sdk.transforms.SerializableFunction<org.apache.beam.sdk.values.Row, T>).
- getRowToJsonBytesFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
-
Returns a SimpleFunction mapping Beam Rows to JSON byte[] arrays.
- getRowToJsonStringsFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
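A minimal sketch of JsonUtils.getRowToJsonBytesFunction, assuming a simple schema of JSON-compatible field types:

import java.nio.charset.StandardCharsets;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.schemas.utils.JsonUtils;
import org.apache.beam.sdk.transforms.SimpleFunction;
import org.apache.beam.sdk.values.Row;

public class JsonRowExample {
  public static void main(String[] args) {
    Schema schema =
        Schema.builder().addStringField("name").addInt64Field("count").build();
    Row row = Row.withSchema(schema).addValues("beam", 42L).build();

    // The returned SimpleFunction serializes each Row to UTF-8 JSON bytes.
    SimpleFunction<Row, byte[]> toJson = JsonUtils.getRowToJsonBytesFunction(schema);
    System.out.println(new String(toJson.apply(row), StandardCharsets.UTF_8));
  }
}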
- getRowToProtoBytes(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
- getRowToProtoBytesFn(Class<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
- getRowToProtoBytesFromSchema(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
- getRowToRawBytesFunction(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- getRowToRawBytesFunction(String) - Static method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- getRowType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The type of the primary keys and modified columns within this record.
- getRowType(RelDataTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
- getRpcPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getRule() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifierListContext
- getRuleNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- getRuleNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- getRuleSets() - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRuleSets
- getRunner() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
The pipeline runner that will be used to execute the pipeline.
- getRunningAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which the connector started processing this partition.
- getS3ClientBuilder() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Builder used to create the S3Client.
- getS3ClientFactoryClass() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getS3StorageClass() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getS3StorageClass() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
The AWS S3 storage class used for creating S3 objects.
- getS3ThreadPoolSize() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getS3ThreadPoolSize() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Thread pool size, limiting the max concurrent S3 operations.
- getS3UploadBufferSizeBytes() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getS3UploadBufferSizeBytes() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Size of S3 upload chunks.
- getSafeFilepattern() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
- getSafeSchema() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
- getSamplePeriod() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The length of time sampled request data will be retained.
- getSamplePeriodBucketSize() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The size of buckets within the specified samplePeriod.
- getSamplingStrategy() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
- getSasToken() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- getSaveHeapDumpsToGcsPath() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
CAUTION: This option implies dumpHeapOnOOM, and has similar caveats.
- getSavepointPath() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getSaveProfilesToGcs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
- getSbeFields() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
- getScale() - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
- getScan() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
- getScanType() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getScheduledAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which this partition was scheduled to be queried.
- getScheduledExecutorService() - Method in interface org.apache.beam.sdk.options.ExecutorOptions
-
The ScheduledExecutorService instance to use to create threads; can be overridden to specify a ScheduledExecutorService that is compatible with the user's environment.
- getSchema() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns the schema used by this coder.
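A minimal sketch of reading the schema off an AvroCoder; the Event POJO is hypothetical, and the Avro schema is inferred reflectively:

import org.apache.beam.sdk.extensions.avro.coders.AvroCoder;

public class AvroCoderExample {
  /** Hypothetical POJO, for illustration only. */
  public static class Event {
    public String id;
    public long count;
  }

  public static void main(String[] args) {
    // AvroCoder.of(Class) derives the Avro schema reflectively;
    // getSchema() exposes the schema the coder encodes with.
    AvroCoder<Event> coder = AvroCoder.of(Event.class);
    System.out.println(coder.getSchema());
  }
}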
- getSchema() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
- getSchema() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getSchema() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Get the schema info of the table.
- getSchema() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
- getSchema() - Method in class org.apache.beam.sdk.extensions.sql.meta.SchemaBaseBeamTable
- getSchema() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
- getSchema() - Static method in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
- getSchema() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
-
The schema used by sources to deserialize data and create Beam Rows.
- getSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
- getSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getSchema() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getSchema() - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
-
Schema for the destination, in the event that it must be dynamically created.
- getSchema() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getSchema() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getSchema() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getSchema() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the schema of a Snowflake table.
- getSchema() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getSchema() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getSchema() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
-
Returns the schema associated with this type.
- getSchema() - Method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
- getSchema() - Method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.TypeDescriptorWithSchema
- getSchema() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
- getSchema() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Returns the schema used by this CoGbkResult.
- getSchema() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the attached schema.
- getSchema() - Method in class org.apache.beam.sdk.values.Row.Builder
-
Return the schema for the row being built.
- getSchema() - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
- getSchema() - Method in class org.apache.beam.sdk.values.Row
-
Return the Schema which describes the fields.
- getSchema(TableReference, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
- getSchema(DestinationT) - Method in class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
-
Return an AVRO schema for a given destination.
- getSchema(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns the table schema for the destination.
- getSchema(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- getSchema(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
- getSchema(Class<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
- getSchema(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return a Beam Schema from the Pub/Sub schema resource, if it exists.
- getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Return a Beam Schema from the Pub/Sub schema resource, if it exists.
- getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
Return a Beam Schema from the Pub/Sub schema resource, if it exists.
- getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- getSchema(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve a Schema for a given TypeDescriptor type.
- getSchemaCoder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageSchemaCoder
- getSchemaCoder(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve a SchemaCoder for a given Class type.
- getSchemaCoder(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve a SchemaCoder for a given TypeDescriptor type.
- getSchemaId() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
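A minimal sketch of the SchemaRegistry lookups listed above, assuming a hypothetical POJO annotated with @DefaultSchema so the default registry can infer its schema:

import org.apache.beam.sdk.schemas.JavaFieldSchema;
import org.apache.beam.sdk.schemas.NoSuchSchemaException;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.schemas.SchemaCoder;
import org.apache.beam.sdk.schemas.SchemaRegistry;
import org.apache.beam.sdk.schemas.annotations.DefaultSchema;

public class SchemaRegistryExample {
  /** Hypothetical POJO; JavaFieldSchema infers a schema from its public fields. */
  @DefaultSchema(JavaFieldSchema.class)
  public static class Event {
    public String id;
    public long count;
  }

  public static void main(String[] args) throws NoSuchSchemaException {
    SchemaRegistry registry = SchemaRegistry.createDefault();
    Schema schema = registry.getSchema(Event.class); // throws if no provider matches
    SchemaCoder<Event> coder = registry.getSchemaCoder(Event.class);
    System.out.println(schema);
    System.out.println(coder);
  }
}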
- getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.avro.AvroTableProvider
- getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
- getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubTableProvider
- getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
- getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return the PubsubClient.SchemaPath from the PubsubClient.TopicPath, if it exists.
- getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Return the PubsubClient.SchemaPath from the PubsubClient.TopicPath, if it exists.
- getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
Return the PubsubClient.SchemaPath from the PubsubClient.TopicPath, if it exists.
- getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- getSchemaProvider(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve a registered SchemaProvider for a given Class.
- getSchemaProvider(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve a registered SchemaProvider for a given TypeDescriptor.
- getSchemaProviders() - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaRegistrar
- getSchemaProviders() - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProviderRegistrar
- getSchemaProviders() - Method in interface org.apache.beam.sdk.schemas.SchemaProviderRegistrar
-
Returns a list of schema providers which will be registered by default within each schema registry instance.
- getSchemaRegistry() - Method in class org.apache.beam.sdk.Pipeline
- getSchematizedData() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets schematized data.
- getSchemaWithoutAttributes(Schema, List<String>) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- getScheme() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
- getScheme() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
The uri scheme used by resources on this filesystem.
- getScheme() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
- getScheme() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
- getScheme() - Method in class org.apache.beam.sdk.io.FileSystem
-
Get the URI scheme which defines the namespace of the FileSystem.
- getScheme() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
-
Get the scheme which defines the namespace of the ResourceId.
- getSdkComponents() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
- getSdkContainerImage() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Container image used to configure the SDK execution environment on workers.
- getSdkHarnessContainerImageOverrides() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Overrides for SDK harness container images.
- getSdkHarnessLogLevelOverrides() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
This option controls the log levels for specifically named loggers.
- getSdkWorkerId() - Method in interface org.apache.beam.sdk.fn.server.HeaderAccessor
-
This method should be called from the request method.
- getSdkWorkerParallelism() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- getSearchEndPoint() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getSeconds() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate
- getSelectedFields() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- getSemiPersistDir() - Method in interface org.apache.beam.sdk.options.RemoteEnvironmentOptions
- getSempClientFactory() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getSenderTimestamp() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the timestamp (in milliseconds since the Unix epoch) when the message was sent by the sender.
- getSendFacility() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets send facility.
- getSendTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets send time.
- getSequenceNumber() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getSequenceNumber() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
-
Deprecated.
- getSequenceNumber() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getSequenceNumber() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
Gets the sequence number of the message (if applicable).
- getSerializableFunctionUdfs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
-
For UDFs implementing SerializableFunction.
- getSerializableOptions() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getSerializableOptions() - Method in class org.apache.beam.runners.spark.translation.SparkTranslationContext
- getSerializedATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- getSerializedATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- getSerializedKey() - Method in class org.apache.beam.runners.flink.adapter.FlinkKey
- getSerializedWindowingStrategy() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration
- getSerializer(String, Schema, Map<String, Object>) - Static method in class org.apache.beam.sdk.schemas.io.payloads.PayloadSerializers
- getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.io.payloads.AvroPayloadSerializerProvider
- getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoPayloadSerializerProvider
- getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.io.thrift.ThriftPayloadSerializerProvider
- getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.schemas.io.payloads.JsonPayloadSerializerProvider
- getSerializer(Schema, Map<String, Object>) - Method in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializerProvider
-
Get a PayloadSerializer.
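A minimal sketch of looking up a serializer through PayloadSerializers, assuming the "json" provider id and an empty parameter map:

import java.util.Collections;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.schemas.io.payloads.PayloadSerializer;
import org.apache.beam.sdk.schemas.io.payloads.PayloadSerializers;
import org.apache.beam.sdk.values.Row;

public class PayloadSerializerExample {
  public static void main(String[] args) {
    Schema schema = Schema.builder().addStringField("name").build();
    // "json" resolves to the JSON provider; an unknown id fails the lookup.
    PayloadSerializer serializer =
        PayloadSerializers.getSerializer("json", schema, Collections.emptyMap());
    byte[] bytes = serializer.serialize(Row.withSchema(schema).addValue("beam").build());
    System.out.println(serializer.deserialize(bytes));
  }
}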
- getServer() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Get the underlying Server contained by this GrpcFnServer.
- getServerFactory() - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.Provider
- getServerFactory() - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.Provider
- getServerFactory() - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory.Provider
-
Create the ServerFactory applicable to this environment.
- getServerFactory() - Method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.Provider
- getServerName() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getServerName() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getServerName() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getServerTransactionId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The unique transaction id in which the modifications occurred.
- getService() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Get the service exposed by this GrpcFnServer.
- getServiceAccount() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Run the job as a specific service account, instead of the default GCE robot.
- getServiceURL(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getServiceURL(String, String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getSessionProperties() - Method in class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- getSessionProperties() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
Override this method to provide your specific properties, including all those related to authentication.
- getSessionServiceFactory() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getSetFieldCreator(TypeDescriptor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- getSetters(TypeDescriptor<?>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
Return the list of FieldValueSetters for a Java Bean class.
- getSetters(TypeDescriptor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- getSha256() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
-
The SHA-256 hash of the source file.
- getShard() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
- getShardId() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getShardingFunction() - Method in class org.apache.beam.sdk.io.WriteFiles
- getShardNameTemplate() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
-
See ShardNameTemplate for the expected values.
- getShardNumber() - Method in class org.apache.beam.sdk.values.ShardedKey
- getShardTemplate() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- getSharedKeySize() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefix
- getShortTableUrn() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
Return shortened tablespec in datasets/[dataset]/tables/[table] format.
- getShortValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
Return the optional short value for an item, or null if none is provided.
- getShortValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The optional short value for an item, or null if none is provided.
- getShutdownSourcesAfterIdleMs() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
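DisplayData items such as the short value above are typically registered from populateDisplayData; a minimal sketch follows (the pass-through transform and its pattern item are hypothetical):

import org.apache.beam.sdk.transforms.PTransform;
import org.apache.beam.sdk.transforms.display.DisplayData;
import org.apache.beam.sdk.values.PCollection;

/** Hypothetical pass-through transform, for illustration only. */
public class MyTransform extends PTransform<PCollection<String>, PCollection<String>> {
  @Override
  public PCollection<String> expand(PCollection<String> input) {
    return input;
  }

  @Override
  public void populateDisplayData(DisplayData.Builder builder) {
    super.populateDisplayData(builder);
    // Each item carries a value, an optional label, and (for some value
    // types) a derived short value surfaced via getShortValue().
    builder.add(DisplayData.item("pattern", "*.txt").withLabel("File pattern"));
  }
}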
- getSideInput(String) - Method in interface org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory.SideInputGetter
- getSideInputBroadcast(PCollection<T>, SideInputValues.Loader<T>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
- getSideInputBroadcast(PCollection<T>, SideInputValues.Loader<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getSideInputDataSets() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- getSideInputKeys() - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
-
Get the tag IDs of all the keys.
- getSideInputs() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
- getSideInputs() - Method in class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
- getSideInputs() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Override to specify that this object needs access to one or more side inputs.
- getSideInputs() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Specifies that this object needs access to one or more side inputs.
- getSideInputs() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns the side inputs used by this Combine operation.
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
Returns the side inputs used by this Combine operation.
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
-
Returns the side inputs used by this Combine operation.
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Requirements
-
The side inputs that this Contextful needs access to.
- getSideInputs(Iterable<PCollectionView<?>>, JavaSparkContext, SparkPCollectionView) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Create SideInputs as Broadcast variables.
- getSideInputs(ExecutableStage) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
- getSideInputSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
Get a mapping from PTransform id to side input id to side inputs that are used during execution.
- getSideInputWindow(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
-
Returns the window of the side input corresponding to the given window of the main input.
- getSingleFileMetadata() - Method in class org.apache.beam.sdk.io.FileBasedSource
-
Returns the information about the single file that this source is reading from.
- getSinglePCollection() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Like PCollectionRowTuple.get(String), but is a convenience method to get a single PCollection without providing a tag for that output.
- getSingleTokenNewPartition(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
-
Return a new NewPartition that only contains one token that matches the parentPartition.
- getSingleWorkerStatus(String, long, TimeUnit) - Method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
-
Get the latest SDK worker status from the client's corresponding SDK harness.
- getSink() - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool
-
Sink for control clients.
- getSink() - Method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
- getSink() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
Returns the FileBasedSink for this write operation.
- getSink() - Method in class org.apache.beam.sdk.io.WriteFiles
- getSinkGroupId() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getSinks() - Static method in class org.apache.beam.sdk.metrics.Lineage
-
Lineage representing sinks.
- getSize() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
-
Get the size.
- getSize() - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
-
Size of the generated windows.
- getSize() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
- getSize() - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
- getSize() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
- getSize() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
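A minimal sketch of the getSize() accessors on FixedWindows and SlidingWindows:

import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
import org.joda.time.Duration;

public class WindowSizeExample {
  public static void main(String[] args) {
    FixedWindows fixed = FixedWindows.of(Duration.standardMinutes(5));
    SlidingWindows sliding =
        SlidingWindows.of(Duration.standardMinutes(10)).every(Duration.standardMinutes(1));
    // getSize() reports the configured window duration in both cases.
    System.out.println(fixed.getSize());
    System.out.println(sliding.getSize());
  }
}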
- getSize(StreamProgress) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
- getSize(PartitionMetadata, TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
- getSize(TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
- getSize(PulsarSourceDescriptor, OffsetRange) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
- getSketchFromByteBuffer(ByteBuffer) - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount
-
Converts the passed-in sketch from ByteBuffer to byte[], mapping null ByteBuffers (representing empty sketches) to empty byte[]s.
- getSkipHeaderLines() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
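A minimal sketch of the HllCount sketch lifecycle that produces the byte[] representation handled by getSketchFromByteBuffer, assuming the HllCount.Init and HllCount.Extract transforms listed elsewhere in this index:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.extensions.zetasketch.HllCount;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;

public class HllExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    PCollection<Integer> ints = p.apply(Create.of(1, 2, 2, 3));
    // Init builds HLL++ sketches as byte[]; Extract estimates distinct counts.
    PCollection<byte[]> sketches = ints.apply(HllCount.Init.forIntegers().globally());
    PCollection<Long> counts = sketches.apply(HllCount.Extract.globally());
    p.run().waitUntilFinish();
  }
}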
- getSkipKeyClone() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getSkipValueClone() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getSnapshot() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getSnapshotId() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getSnapshots() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteResult
- getSnowPipe() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getSocketTimeout() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getSorterType() - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
-
Returns the sorter type.
- getSource() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
-
The file to stage.
- getSource() - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputSplit
- getSource() - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool
-
Source of control clients.
- getSource() - Method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
- getSource() - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
- getSource() - Method in class org.apache.beam.sdk.io.Read.Bounded
-
Returns the BoundedSource used to create this Read PTransform.
- getSource() - Method in class org.apache.beam.sdk.io.Read.Unbounded
-
Returns the UnboundedSource used to create this Read PTransform.
- getSource() - Method in class org.apache.beam.sdk.io.TextIO.Read
- getSource() - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
- getSources() - Static method in class org.apache.beam.sdk.metrics.Lineage
-
Lineage representing sources and optionally side inputs.
- getSparkCheckpointDir() - Method in class org.apache.beam.runners.spark.translation.streaming.Checkpoint.CheckpointDir
- getSparkContext() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getSparkContext() - Method in class org.apache.beam.runners.spark.translation.SparkTranslationContext
- getSparkContext(SparkPipelineOptions) - Static method in class org.apache.beam.runners.spark.translation.SparkContextFactory
- getSparkMaster() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- getSparkPCollectionViewType() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
- getSparkReceiverClass() - Method in class org.apache.beam.sdk.io.sparkreceiver.ReceiverBuilder
- getSparkSession() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext
- getSparkSession() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
- getSparkSession() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- getSplit() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.SerializableSplit
- getSplitBacklogBytes() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getSplitBacklogBytes() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns the size of the backlog of unread data in the underlying data source represented by this split of this source.
- getSplitNumber() - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputSplit
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
Returns the total amount of parallelism in the consumed (returned and processed) range of this reader's current BoundedSource (as would be returned by BoundedSource.BoundedReader.getCurrentSource()).
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
- getSplitPointsProcessed() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
Returns the total number of split points that have been processed.
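A minimal sketch of how an OffsetRangeTracker accumulates the count returned by getSplitPointsProcessed; the offsets and range bounds are illustrative:

import org.apache.beam.sdk.io.range.OffsetRangeTracker;

public class TrackerExample {
  public static void main(String[] args) {
    // Track the half-open offset range [0, 10).
    OffsetRangeTracker tracker = new OffsetRangeTracker(0, 10);
    tracker.tryReturnRecordAt(true, 0); // a split point at offset 0
    tracker.tryReturnRecordAt(true, 1); // a split point at offset 1
    // Reports how many split points have been fully processed so far.
    System.out.println(tracker.getSplitPointsProcessed());
  }
}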
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
Returns the total amount of parallelism in the unprocessed part of this reader's current BoundedSource (as would be returned by BoundedSource.BoundedReader.getCurrentSource()).
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
- getSplitSerializer() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- getSplitSources() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
-
Visible so that we can check this in tests.
- getSplitState() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplit
- getSSEAlgorithm() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getSSEAlgorithm() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Algorithm for SSE-S3 encryption, e.g.
- getSSECustomerKey() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getSSECustomerKey() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
SSE key for SSE-C encryption, e.g.
- getSSEKMSKeyId() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- getSSEKMSKeyId() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
KMS key id for SSE-KMS encryption, e.g.
- getSsl() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getStableUniqueNames() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
Whether to check for stable unique names on each transform.
- getStackTrace() - Method in class org.apache.beam.io.requestresponse.ApiIOError
-
The Exception stack trace.
- getStackTrace() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
-
The caught Throwable.getStackTrace().
- getStackTrace() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getStageBundleFactory(ExecutableStage) - Method in class org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext
- getStageBundleFactory(ExecutableStage) - Method in interface org.apache.beam.runners.fnexecution.control.ExecutableStageContext
- getStagedArtifacts(String) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
-
Returns the rewritten artifacts associated with this job, keyed by environment.
- getStageName() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- getStager() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The resource stager instance that should be used to stage resources.
- getStagerClass() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The class responsible for staging resources to be accessible by workers during job execution.
- getStagingBucketDir() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the directory where files are staged.
- getStagingBucketDir() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceConfig
-
Gets the bucket name and directory where files were staged and are waiting to be loaded.
- getStagingBucketName() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getStagingBucketName() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getStagingLocation() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
GCS path for staging local files, e.g.
- getStart() - Method in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
- getStart() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
- getStart() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
- getStart() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
- getStartAtTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
- getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- getStartingStrategy() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getStartKey() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns the ByteKey representing the lower bound of this ByteKeyRange.
- getStartOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns the starting offset of the source.
- getStartPosition() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
- getStartPosition() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
- getStartPosition() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
-
Returns the starting position of the current range, inclusive.
- getStartReadTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getStartTime() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
-
Returns the time the reader was started.
- getStartTime() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
- getStartTime() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
It is the partition_start_time of the child partition token.
- getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
It is the start time at which the partition started existing in Cloud Spanner.
- getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionStartRecord
-
It is the partition start time of the partition tokens.
- getState() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
- getState() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
- getState() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
- getState() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
- getState() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
- getState() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
- getState() - Method in class org.apache.beam.runners.jet.JetPipelineResult
- getState() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Retrieve the job's current state.
- getState() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
- getState() - Method in class org.apache.beam.runners.spark.stateful.SparkStateInternals
- getState() - Method in class org.apache.beam.runners.spark.stateful.StateAndTimers
- getState() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult
- getState() - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
- getState() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The state the current partition is in.
- getState() - Method in interface org.apache.beam.sdk.PipelineResult
-
Retrieves the current state of the pipeline execution.
- getState() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimator
-
Get the current state of the WatermarkEstimator instance, which can be used to recreate the WatermarkEstimator when processing the restriction.
- getState() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.Manual
- getState() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.MonotonicallyIncreasing
- getState() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.WallTime
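A minimal sketch of getState() on the manual watermark estimator listed above; the initial watermark value is illustrative:

import org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators;
import org.joda.time.Instant;

public class WatermarkStateExample {
  public static void main(String[] args) {
    // Seeded with an initial watermark; getState() returns the Instant
    // needed to recreate the estimator after a checkpoint.
    WatermarkEstimators.Manual estimator = new WatermarkEstimators.Manual(new Instant(0));
    estimator.setWatermark(Instant.now());
    Instant state = estimator.getState();
    System.out.println(state);
  }
}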
- getState(JobApi.GetJobStateRequest, StreamObserver<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- getState(JobApi.GetJobStateRequest, StreamObserver<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.portability.testing.TestJobService
- getStateBackend() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getStateBackendFactory() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
Deprecated. Please use setStateBackend below.
- getStateBackendStoragePath() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getStateCoder() - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
Used to encode the state of this Watch.Growth.TerminationCondition.
- getStateCoder(Pipeline) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Provide the state coder.
- getStateEvent() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Retrieve the job's current state.
- getStateStream(JobApi.GetJobStateRequest, StreamObserver<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- getStaticCreator(TypeDescriptor<?>, Method, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- getStaticCreator(TypeDescriptor<?>, Method, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- getStatistic() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
- getStatistics(BaseStatistics) - Method in class org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat
- getStatistics(BaseStatistics) - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat
- getStatus() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- getStatusCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getStatusDate() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- getStatusUpdateFrequency() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Determines the frequency of emission of the OrderedProcessingStatus elements.
- getStepName() - Method in class org.apache.beam.runners.spark.translation.SparkProcessContext
- getStepNames() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
-
Returns the mapping of AppliedPTransforms to the internal step name for that AppliedPTransform.
- getStopPipelineWatermark() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
- getStopPosition() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
- getStopPosition() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
- getStopPosition() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
-
Returns the ending position of the current range, exclusive.
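As a quick illustration (offsets are arbitrary), the offset-based tracker reports its exclusive end through this method:

  // Track offsets [0, 100): getStopPosition() returns the exclusive upper bound.
  OffsetRangeTracker tracker = new OffsetRangeTracker(0L, 100L);
  Long stop = tracker.getStopPosition(); // 100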
- getStopReadTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getStorageApiAppendThresholdBytes() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageApiAppendThresholdRecordCount() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageClient(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
-
Returns a real, mock, or fake BigQueryServices.StorageClient.
- getStorageClient(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
- getStorageClient(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- getStorageIntegrationName() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getStorageIntegrationName() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the Snowflake integration that is used in the COPY statement.
- getStorageIntegrationName() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getStorageLevel() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- getStorageWriteApiMaxRequestSize() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageWriteApiMaxRetries() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageWriteApiTriggeringFrequencySec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageWriteMaxInflightBytes() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageWriteMaxInflightRequests() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStreamAppendClient(String, DescriptorProtos.DescriptorProto, boolean, AppendRowsRequest.MissingValueInterpretation) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
-
Create an append client for a given Storage API write stream.
- getStreamAppendClient(String, DescriptorProtos.DescriptorProto, boolean, AppendRowsRequest.MissingValueInterpretation) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- getStreamAppendClient(String, DescriptorProtos.DescriptorProto, boolean, AppendRowsRequest.MissingValueInterpretation) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getStreaming() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getStreamingContext() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- getStreamingContext() - Method in class org.apache.beam.runners.spark.translation.SparkStreamingTranslationContext
- getStreamingService() - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices
- getStreamingService() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeServicesImpl
- getStreamingSideInputCacheExpirationMillis() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getStreamingSideInputCacheMb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getStreamingTimeoutMs() - Method in interface org.apache.beam.runners.spark.SparkPortableStreamingPipelineOptions
- getStreamName() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getStreamSources() - Method in class org.apache.beam.runners.spark.translation.streaming.UnboundedDataset
- getStreamTableDebugString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- getString() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getString(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a String value by field index; a ClassCastException is thrown if the schema doesn't match.
- getString(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.STRING value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getString(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getString(Map<String, Object>, String, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
- getStrings(Map<String, Object>, String, List<String>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
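A minimal sketch of the Row.getString accessors above (schema and values are illustrative):

  Schema schema = Schema.builder().addStringField("name").build();
  Row row = Row.withSchema(schema).addValue("Ada").build();
  String byIndex = row.getString(0);      // by field index
  String byName  = row.getString("name"); // by field name; a schema mismatch throws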
- getStringSet() - Method in class org.apache.beam.sdk.metrics.StringSetResult.EmptyStringSetResult
-
Returns an empty immutable set.
- getStringSet() - Method in class org.apache.beam.sdk.metrics.StringSetResult
- getStringSet(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- getStringSet(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the StringSet that should be used for implementing the given metricName in this container.
- getStringSets() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the sets that matched the filter.
- getStuckCommitDurationMillis() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getSubmissionMode() - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
- getSubmissionMode() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- getSubnetwork() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
GCE subnetwork for launching workers.
- getSubProvider(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.FullNameTableProvider
- getSubProvider(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Returns a sub-provider, e.g.
- getSubProviders() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Returns all sub-providers, e.g.
- getSubSchema(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
-
If this is the root schema (in other words, a CatalogManager), the sub schema will be a Catalog's metastore.
- getSubSchemaNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getSubscription() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getSubscription() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the subscription being read from.
- getSubscriptionName() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- getSubscriptionProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the ValueProvider for the subscription being read from.
- getSubSequenceNumber() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
-
Gets successful bodies from Write.
- getSuccessfulBundles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
Gets successful FhirBundleResponse from execute bundles operation.
- getSuccessfulInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Returns a PCollection containing the TableRows that were written to BQ via the streaming insert API.
- getSuccessfulPublish() - Method in class org.apache.beam.sdk.io.solace.write.SolaceOutput
- getSuccessfulStorageApiInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Return all rows successfully inserted using one of the storage-api insert methods.
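A hedged sketch of consuming these WriteResult getters (the input collection, table spec, and write method are assumptions):

  // Assumed: `rows` is an existing PCollection<TableRow> in a pipeline.
  WriteResult result = rows.apply(
      BigQueryIO.writeTableRows()
          .to("project:dataset.table") // assumed table spec
          .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS));
  // Rows acknowledged by the streaming insert API:
  PCollection<TableRow> inserted = result.getSuccessfulInserts();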
- getSuccessfulTableLoads() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Returns a PCollection containing the TableDestinations that were successfully loaded using the batch load API.
- getSuggestedFilenameSuffix() - Method in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- getSuggestedFilenameSuffix() - Method in interface org.apache.beam.sdk.io.FileBasedSink.OutputFileHints
- getSuggestedSuffix() - Method in enum class org.apache.beam.sdk.io.Compression
- getSum() - Method in class org.apache.beam.sdk.metrics.DistributionResult
- getSum() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
- getSumAndReset() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
- getSummary() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getSupertype(Class<? super T>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns the generic form of a supertype.
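A short illustration of what the generic form of a supertype means here:

  // The anonymous subclass captures the full generic type.
  TypeDescriptor<ArrayList<String>> desc = new TypeDescriptor<ArrayList<String>>() {};
  // Describes List<String>, not the raw List.
  TypeDescriptor<? super ArrayList<String>> supertype = desc.getSupertype(List.class);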
- getSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
- getSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergFilter
- getSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableFilter
- getSupportedClass() - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
-
Gets the class this CloudObjectTranslator is capable of converting.
- getSupportedClass() - Method in class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
- getSupportedClass() - Method in class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
- getSynchronizedProcessingOutputWatermark() - Method in interface org.apache.beam.runners.local.Bundle
-
Returns the processing time output watermark at the time the producing Executable committed this bundle.
- getSynchronizedProcessingTime() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
- getTable() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Returns the table to read, or null if reading from a query instead.
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Returns the table reference, or null.
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- getTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getTable() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getTable() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getTable() - Method in class org.apache.beam.sdk.io.kudu.TableAndRecord
- getTable() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
- getTable() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
- getTable() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getTable() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the table used as a source for reading or a destination for writing.
- getTable() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getTable(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Gets the specified Table resource by table ID.
- getTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
-
Gets the specified Table resource by table ID.
- getTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getTable(TableReference, List<String>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
- getTable(TableReference, List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
- getTable(TableReference, List<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
- getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
- getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getTable(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns a TableDestination object for the destination.
- getTable(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
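A hedged sketch of a DynamicDestinations subclass whose getTable(DestinationT) routes each destination key to its own table (all names are illustrative):

  class PerUserDestinations extends DynamicDestinations<TableRow, String> {
    @Override public String getDestination(ValueInSingleWindow<TableRow> e) {
      return (String) e.getValue().get("user"); // assumed input field
    }
    @Override public TableDestination getTable(String user) {
      return new TableDestination("project:dataset.events_" + user, "per-user table");
    }
    @Override public TableSchema getSchema(String user) {
      return new TableSchema().setFields(Arrays.asList(
          new TableFieldSchema().setName("user").setType("STRING")));
    }
  }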
- getTable(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- getTable(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Get a specific table from this provider if it is present, or null if it is not present.
- getTable(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- getTable(BigQueryOptions, TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
- getTableAdminClient() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
- getTableByFullName(TableName) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- getTableByFullName(TableName) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.FullNameTableProvider
- getTableConstraints(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns TableConstraints (including primary and foreign key) to be used when creating the table.
- getTableConstraints(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- getTableConstraints(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- getTableCreateConfig() - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination
-
Metadata and constraints for creating a new table, if it must be done dynamically.
- getTableDescription() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns the table being read from.
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
-
Return the metadata table name.
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
- getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
- getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
- getTableIdentifier() - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination
-
The iceberg table identifier to write data to.
- getTableIdentifier() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getTableIdentifier() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getTableIdentifierString() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getTableImpl(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getTableName() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Table name, the last element of the fully-specified table name with path.
- getTableName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
- getTableName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The name of the table in which the modifications within this record occurred.
- getTableNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getTablePath(Table) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
-
Returns a full table path (excluding top-level schema) for a given ZetaSQL Table.
- getTableProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
- getTableProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getTableProvider() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Returns the table to read, or null if reading from a query instead.
- getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- getTableReference() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getTableResource(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- getTableRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergTableProvider
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
- getTables() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Get all tables from this provider.
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
- getTables() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
- getTableSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
- getTableSchema() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
- getTableSchema() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
- getTableSchema(String, String) - Static method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
-
Returns TableSchema for a given table.
- getTableSchema(String, String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogBeamSchema
-
Gets the table schema, or absent optional if the table doesn't exist in the database.
- getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
-
Specifies a table for a BigQuery read job.
- getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
Return the tablespec in [project:].dataset.tableid format.
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
- getTableStatistics(PipelineOptions) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Estimates the number of rows or the rate for unbounded Tables.
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTable
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
- getTableStatistics(PipelineOptions, SchemaIO) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
- getTableStatistics(PipelineOptions, SchemaIO) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
- getTableStringIdentifier(ValueInSingleWindow<Row>) - Method in interface org.apache.beam.sdk.io.iceberg.DynamicDestinations
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.avro.AvroTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.KafkaTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.parquet.ParquetTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite.PubsubLiteTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.seqgen.GenerateSequenceTableProvider
- getTableType() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Gets the table type this provider handles.
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
- getTableUrn(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
Return the tablespec in projects/[project]/datasets/[dataset]/tables/[table] format.
- getTag() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getTag() - Method in class org.apache.beam.sdk.values.TaggedPValue
-
Returns the local tag associated with the PValue.
- getTag(int) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
Returns the tuple tag at the given index.
- getTagInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
Deprecated. This method will be removed entirely. The PCollection underlying a side input is part of the side input's specification with a ParDo transform, which will obtain that information via a package-private channel.
-
Returns a unique TupleTag identifying this PCollectionView.
- getTargetDataset() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
- getTargetParallelism() - Method in interface org.apache.beam.runners.direct.DirectOptions
- getTargetTable(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- getTargetTableId(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- getTempDatasetId() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getTempDirectory() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
- getTempDirectoryProvider() - Method in class org.apache.beam.sdk.io.FileBasedSink
-
Returns the directory inside which temporary files will be written according to the configured FileBasedSink.FilenamePolicy.
- getTempFilename() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
- getTemplateLocation() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Where the runner should generate a template file.
- getTempLocation() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
-
Returns the configured temporary location.
- getTempLocation() - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
-
Returns the configured temporary location.
- getTempLocation() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
A pipeline level default location for storing temporary files.
- getTempRoot() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
- GETTER_WITH_NULL_METHOD_ERROR - Static variable in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- GetterBasedSchemaProvider - Class in org.apache.beam.sdk.schemas
-
Deprecated. New implementations should extend the GetterBasedSchemaProviderV2 class' methods, which receive TypeDescriptors instead of ordinary Classes as arguments; this permits supporting generic type signatures during schema inference.
- GetterBasedSchemaProvider() - Constructor for class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated.
- GetterBasedSchemaProviderBenchmark - Class in org.apache.beam.sdk.jmh.schemas
-
Benchmarks for GetterBasedSchemaProvider on reading / writing fields based on toRowFunction / fromRowFunction.
- GetterBasedSchemaProviderBenchmark() - Constructor for class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
- GetterBasedSchemaProviderV2 - Class in org.apache.beam.sdk.schemas
-
A newer version of GetterBasedSchemaProvider, which works with TypeDescriptors, and which by default delegates the old, Class-based methods to the new ones.
- GetterBasedSchemaProviderV2() - Constructor for class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
- getTerminateAfterSecondsSinceNewOutput() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
-
If no new files are found after this many seconds, this transform will cease to watch for new files.
- GetterTypeSupplier() - Constructor for class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
- getTestMode() - Method in interface org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineOptions
-
Set to true to run the job in test mode.
- getTestTimeoutSeconds() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
- getThrottleDuration() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The amount of time an attempt will be throttled if deemed necessary based on previous success rate.
- getThroughputEstimate() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- getTimeDomain() - Method in interface org.apache.beam.sdk.state.TimerSpec
- getTimePartitioning() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getTimerDataIterator() - Method in class org.apache.beam.runners.spark.translation.SparkProcessContext
- getTimerFamilyId() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
- getTimerFamilyId() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
- getTimerReceivers() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
-
Get a map of (transform id, timer id) to receivers which consume timers, forwarding them to the remote environment.
- getTimerReceivers() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
- getTimers() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- getTimers() - Method in class org.apache.beam.runners.spark.stateful.StateAndTimers
- getTimers(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
- getTimers(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
- getTimerSpec() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
- getTimerSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
Get a mapping from PTransform id to timer id to timer specs that are used during execution.
- getTimers() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Get times so they can be pushed into the GlobalWatermarkHolder.
- getTimestamp() - Method in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
- getTimestamp() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
- getTimestamp() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- getTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
-
Indicates the timestamp for which the change stream query has returned all changes.
- getTimestamp() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getTimestamp() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getTimestamp() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getTimestamp() - Method in class org.apache.beam.sdk.metrics.GaugeResult.EmptyGaugeResult
- getTimestamp() - Method in class org.apache.beam.sdk.metrics.GaugeResult
- getTimestamp() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the timestamp of this FailsafeValueInSingleWindow.
- getTimestamp() - Method in class org.apache.beam.sdk.values.TimestampedValue
- getTimestamp() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Returns the timestamp of this ValueInSingleWindow.
- getTimestamp() - Method in interface org.apache.beam.sdk.values.WindowedValue
-
The timestamp of this value in event time.
- getTimestamp() - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- getTimestamp(T, Instant) - Method in interface org.apache.beam.sdk.io.kafka.KafkaPublishTimestampFunction
-
Returns the timestamp for the element being published to Kafka.
- getTimeStamp() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
Timestamp the message was sent at (in epoch millis).
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the timestamp attribute.
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the timestamp attribute.
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- getTimestampCombiner() - Method in interface org.apache.beam.sdk.state.WatermarkHoldState
-
Return the TimestampCombiner which will be used to determine a watermark hold time given an element timestamp, and to combine watermarks from windows which are about to be merged.
- getTimestampCombiner() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getTimestampFn() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy
-
Returns record timestamp (aka event time).
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.TimestampFnPolicy
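For context, a hedged sketch of selecting one of the policies above in KafkaIO (the pipeline `p`, broker, and topic are assumptions):

  p.apply(KafkaIO.<String, String>read()
      .withBootstrapServers("broker:9092") // assumed
      .withTopic("events")                 // assumed
      .withKeyDeserializer(StringDeserializer.class)
      .withValueDeserializer(StringDeserializer.class)
      // LogAppendTimePolicy.getTimestampForRecord() then reads the broker's
      // log-append time from each KafkaRecord.
      .withTimestampPolicyFactory(TimestampPolicyFactory.withLogAppendTime()));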
- getTimestampMillis() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- getTimestampMsSinceEpoch() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
-
Timestamp for element (ms since epoch).
- getTimestampPolicyFactory() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- getTimestampPolicyFactory() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getTimestampTransforms() - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
The transforms applied to the arrival time of an element to determine when this trigger allows output.
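As a sketch, the delay registered below is the kind of timestamp transform this method reports:

  // plusDelayOf() adds a 5-minute delay transform, later visible via
  // getTimestampTransforms().
  Window.<String>into(FixedWindows.of(Duration.standardMinutes(10)))
      .triggering(Repeatedly.forever(
          AfterProcessingTime.pastFirstElementInPane()
              .plusDelayOf(Duration.standardMinutes(5))))
      .withAllowedLateness(Duration.ZERO)
      .discardingFiredPanes();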
- getTimestampType() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getTimeToLive() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
-
The number of milliseconds before the message is discarded or moved to the Dead Message Queue.
- getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
- getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- getTiming() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
Return the timing of this pane.
- getTo() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
-
Returns the range end timestamp (exclusive).
- getTo() - Method in class org.apache.beam.sdk.io.range.OffsetRange
- getToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
Unique partition identifier, which can be used to perform a change stream query.
- getTokenNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
-
Deprecated.
- getTokenNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
Deprecated.
- getTokenWithCorrectPartition(Range.ByteStringRange, ChangeStreamContinuationToken) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamContinuationTokenHelper
-
Return the continuation token with the correct partition.
- getToKV() - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the topic being written to.
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the topic being read from.
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
-
Gets the topic from which to read.
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getTopic() - Method in class org.apache.beam.sdk.io.mqtt.MqttRecord
- getTopic() - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessage
- getTopicName() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- getTopicPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaSourceDescriptor
- getTopicPartitions() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getTopicPattern() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the ValueProvider for the topic being written to.
- getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the ValueProvider for the topic being read from.
- getTopics() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- getTopics() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getToRowFunction() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
- getToRowFunction() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
-
Returns the toRow conversion function.
- getToRowFunction() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the attached schema's toRowFunction.
- getToRowFunction(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve the function that converts an object of the specified type to a Row object.
- getToRowFunction(Class<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
- getToRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve the function that converts an object of the specified type to a Row object.
- getToSnapshot() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
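A hedged sketch of the SchemaRegistry.getToRowFunction lookup above (MyPojo stands in for any type with a registered schema):

  SchemaRegistry registry = SchemaRegistry.createDefault();
  // Throws NoSuchSchemaException (checked) if the type has no registered schema.
  SerializableFunction<MyPojo, Row> toRow = registry.getToRowFunction(MyPojo.class);
  Row row = toRow.apply(new MyPojo("Ada", 42)); // assumed constructor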
- getToSnapshotRef() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getTotalBacklogBytes() - Method in interface org.apache.beam.sdk.io.jms.AutoScaler
-
Returns the size of the backlog of unread data in the underlying data source represented by all splits of this source.
- getTotalBacklogBytes() - Method in class org.apache.beam.sdk.io.jms.DefaultAutoscaler
- getTotalBacklogBytes() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns the size of the backlog of unread data in the underlying data source represented by all splits of this source.
- getTotalFields() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- getTotalFields() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- getTotalStreamDuration() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the total stream duration of change stream records so far.
- getTotalStreamTimeMillis() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The total streaming time (in millis) for this record.
- getToTimestamp() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getTrait() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter
- getTraitDef() - Method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
- getTransactionIsolation() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getTransactionTag() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The transaction tag associated with the given transaction.
- getTransform(RunnerApi.FunctionSpec, PipelineOptions) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceSchemaTransformProvider
- getTransform(RunnerApi.FunctionSpec, PipelineOptions) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
- getTransformId() - Method in class org.apache.beam.sdk.fn.data.DataEndpoint
- getTransformId() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
- getTransformId() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
- getTransformingMap(Map<K1, V1>, Function<K1, K2>, Function<V1, V2>) - Static method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils
- getTransformNameMapping() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Mapping of old PTransform names to new ones, specified as JSON {"oldName":"newName",...}.
- getTransformPayloadTranslators() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.DataflowTransformTranslator
- getTransformPayloadTranslators() - Method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey.Registrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.Registrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.runners.spark.translation.streaming.StreamingTransformTranslator.SparkTransformsRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation.WriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.ReadWriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.WriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.iceberg.IcebergSchemaTransformTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.iceberg.IcebergSchemaTransformTranslation.WriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation.WriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation.WriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.TFRecordSchemaTransformTranslation.ReadWriteRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.managed.ManagedSchemaTransformTranslation.ManagedTransformRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.transforms.Redistribute.Registrar
- getTransformStepNames() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
- getTransformTranslator(Class<TransformT>) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
-
Returns the TransformTranslator to use for instances of the specified PTransform class, or null if none registered.
- getTransformTranslator(TransformT) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.PipelineTranslatorBatch
-
Returns a TransformTranslator for the given PTransform if known.
- getTransformTranslator(TransformT) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
-
Returns a TransformTranslator for the given PTransform if known.
- getTransformUniqueID(RunnerApi.FunctionSpec) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
- getTranslator() - Method in class org.apache.beam.runners.dataflow.DataflowRunner
-
Returns the DataflowPipelineTranslator associated with this object.
- getTransport() - Static method in class org.apache.beam.sdk.extensions.gcp.util.Transport
- getTrigger() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getTriggeringFrequencySeconds() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getTriggeringFrequencySeconds() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
- getTruncatedRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.TruncateResult
- getTruncateTimestamps() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptions
-
Whether to truncate timestamps in tables described by Data Catalog.
- getTruncateTimestamps() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
- getTSetEnvironment() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getTSetGraph() - Method in class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
- getTupleTag() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
-
Returns the TupleTag of this TaggedKeyedPCollection.
- getTupleTagCoders(Map<TupleTag<?>, PCollection<?>>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Utility to get mapping between TupleTag and a coder.
- getTupleTagDecodeFunction(Map<TupleTag<?>, Coder<WindowedValue<?>>>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Returns a pair function to convert bytes to value via coder.
- getTupleTagEncodeFunction(Map<TupleTag<?>, Coder<WindowedValue<?>>>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Returns a pair function to convert value to bytes via coder.
- getTupleTagId(PValue) - Static method in class org.apache.beam.runners.jet.Utils
- getTupleTagList() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
Returns the TupleTagList tuple associated with this schema.
- getTwister2Home() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getType() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns the type this coder encodes/decodes.
- getType() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
-
Returns the type for the datum factory.
- getType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.TypeWithNullability
- getType() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
- getType() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
-
Type of the table.
- getType() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
The type of the column.
- getType() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.Column
- getType() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getType() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Destination
-
Gets the type of the destination (TOPIC, QUEUE or UNKNOWN).
- getType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
Returns the field type.
- getType() - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns the field's Schema.FieldType.
- getType() - Method in interface org.apache.beam.sdk.testing.TestStream.Event
- getType() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
Retrieve the DisplayData.Type of display data.
- getType() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The DisplayData.Type of display data.
- getType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns the Type represented by this TypeDescriptor.
- getType(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getType(String) - Method in class org.apache.beam.sdk.schemas.Schema.Options
-
Get the type of an option.
- getTypeClass() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- getTypeClass() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- getTypeDescriptor() - Method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.TypeDescriptorWithSchema
- getTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.ViewFn
-
Return the TypeDescriptor describing the output of this fn.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns a TypeDescriptor<T> with some reflective information about T, if possible.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapFromVoidKeyViewFn
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapViewFn
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapFromVoidKeyViewFn
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapViewFn
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn2
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn2
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn2
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.TupleTag
-
Returns a TypeDescriptor capturing what is known statically about the type of this TupleTag instance's most-derived class.
- getTypeFactory() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
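A short illustration of the anonymous-subclass idiom behind TupleTag.getTypeDescriptor() above:

  // The trailing {} creates an anonymous subclass, so the element type
  // survives erasure and getTypeDescriptor() can recover it.
  TupleTag<KV<String, Long>> tag = new TupleTag<KV<String, Long>>() {};
  TypeDescriptor<KV<String, Long>> type = tag.getTypeDescriptor();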
- getTypeMap() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getTypeName() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- getTypeName() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- getTypeNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- getTypeParameter(String) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a TypeVariable for the named type parameter.
- getTypePayload() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
- getTypes() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a set of TypeDescriptor, one for each superclass as well as each interface implemented by this class.
- getTypeUrn() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
- getUdaf(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- getUdafImpl() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- getUdafs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
- getUnalignedCheckpointEnabled() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- getUnboundedReaderMaxElements() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The max elements read from an UnboundedReader before checkpointing.
- getUnboundedReaderMaxReadTimeMs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The max amount of time an UnboundedReader is consumed before checkpointing.
- getUnboundedReaderMaxReadTimeSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Deprecated.
- getUnboundedReaderMaxWaitForElementsMs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The max amount of time waiting for elements when reading from UnboundedReader.
- getUnderlyingDoFn() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
- getUnderlyingSchemaProvider(Class<T>) - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
-
Retrieves the underlying SchemaProvider for the given Class.
- getUnderlyingSchemaProvider(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
-
Retrieves the underlying SchemaProvider for the given TypeDescriptor.
- getUnfinishedEndpoints() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Get all unfinished data and timers endpoints represented as [transform_id]:data and [transform_id]:timers:[timer_family_id].
- getUnfinishedMinWatermark() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.cache.AsyncWatermarkCache
- getUnfinishedMinWatermark() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.cache.NoOpWatermarkCache
- getUnfinishedMinWatermark() - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.cache.WatermarkCache
-
Fetches the earliest partition watermark from the partition metadata table that is not in a PartitionMetadata.State.FINISHED state.
- getUnfinishedMinWatermarkFrom(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Fetches the earliest partition watermark from the partition metadata table that is not in a PartitionMetadata.State.FINISHED state.
- getUnionCoder() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
- getUnionTag() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
- getUniqueId() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- getUnknownFields() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- getUnknownFieldsPayload() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- getUnsharedKeySize() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefix
- getUntilTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
-
The trigger that signals termination of this trigger.
- getUpdateCompatibilityVersion() - Method in interface org.apache.beam.sdk.options.StreamingOptions
- getUpdatedSchema() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
-
If the table schema has been updated, returns the new schema.
- getUpdatedSchema(TableSchema, TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaUpdateUtils
- getUploadBufferSizeBytes() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
-
If non-null, the upload buffer size to be used.
- getUrl() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getUrl() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getUrn() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.PayloadTranslator
- getUrn() - Method in class org.apache.beam.sdk.schemas.transforms.SchemaTransformTranslation.SchemaTransformPayloadTranslator
- getUrn() - Method in interface org.apache.beam.sdk.transforms.Materialization
-
Gets the URN describing this
Materialization
. - getUseActiveSparkSession() - Method in interface org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineOptions
- getUseAltsServer() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- getUseAtLeastOnceSemantics() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getUseCdc() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- getUseCdcWrites() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getUseDataStreamForBatch() - Method in interface org.apache.beam.runners.flink.VersionDependentFlinkPipelineOptions
- getUsePublicIps() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Specifies whether worker pools should be started with public IP addresses.
- getUserAgent() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
A user agent string as per RFC2616, describing the pipeline to external services.
- getUserId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- getUsername() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
- getUsername() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- getUsername() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- getUsername() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getUsername() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getUsername() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getUsername() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getUseSeparateWindmillHeartbeatStreams() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getUsesProvidedSparkContext() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- getUseStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
-
Enables BigQuery's Standard SQL dialect when reading from a query.
- getUseStorageApiConnectionPool() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getUseStorageWriteApi() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getUseStorageWriteApiAtLeastOnce() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getUseTransformService() - Method in interface org.apache.beam.sdk.extensions.python.PythonExternalTransformOptions
- getUseWindmillIsolatedChannels() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getUsingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- getUuid() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- getUUID() - Method in class org.apache.beam.sdk.schemas.Schema
-
Get this schema's UUID.
- getUuidFromMessage(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- getValidate() - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration
- getValidationFailures() - Method in class org.apache.beam.sdk.io.cdap.context.FailureCollectorWrapper
- getValue() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecord
-
Returns the value.
- getValue() - Method in class org.apache.beam.runners.spark.util.ByteArray
- getValue() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
- getValue() - Method in class org.apache.beam.sdk.coders.StructuralByteArray
- getValue() - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
- getValue() - Method in class org.apache.beam.sdk.io.range.ByteKey
-
Returns a read-only
ByteBuffer
representing thisByteKey
. - getValue() - Method in class org.apache.beam.sdk.metrics.GaugeResult.EmptyGaugeResult
- getValue() - Method in class org.apache.beam.sdk.metrics.GaugeResult
- getValue() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType.Value
-
Return the integer enum value.
- getValue() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
-
Returns the current value of the OneOf.
- getValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
Retrieve the value of the display item.
- getValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The value of the display item.
- getValue() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
- getValue() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the value of this
FailsafeValueInSingleWindow
. - getValue() - Method in class org.apache.beam.sdk.values.KV
-
Returns the value of this
KV
. - getValue() - Method in class org.apache.beam.sdk.values.TaggedPValue
-
Returns the
PCollection
. - getValue() - Method in class org.apache.beam.sdk.values.TimestampedValue
- getValue() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Returns the value of this
ValueInSingleWindow
. - getValue() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
- getValue() - Method in interface org.apache.beam.sdk.values.WindowedValue
-
The primary data for this value.
- getValue(int) - Method in class org.apache.beam.sdk.values.Row
-
Get value by field index,
ClassCastException
is thrown if schema doesn't match. - getValue(int) - Method in class org.apache.beam.sdk.values.RowWithGetters
- getValue(int) - Method in class org.apache.beam.sdk.values.RowWithStorage
- getValue(Class<T>) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
-
Returns the current value of the OneOf as the destination type.
- getValue(String) - Method in class org.apache.beam.sdk.schemas.Schema.Options
-
Get the value of an option.
- getValue(String) - Method in class org.apache.beam.sdk.values.Row
-
Get value by field name,
ClassCastException
is thrown if type doesn't match. - getValue(String, Class<T>) - Method in class org.apache.beam.sdk.schemas.Schema.Options
-
Get the value of an option.
- getValueCaptureType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The capture type of the change stream that generated this record.
- getValueClass() - Method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
- getValueCoder() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
-
Returns the value coder.
- getValueCoder() - Method in class org.apache.beam.sdk.coders.KvCoder
- getValueCoder() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
Gets the value coder that will be prefixed by the length.
- getValueCoder() - Method in class org.apache.beam.sdk.coders.MapCoder
- getValueCoder() - Method in class org.apache.beam.sdk.coders.NullableCoder
-
Returns the inner
Coder
wrapped by thisNullableCoder
instance. - getValueCoder() - Method in class org.apache.beam.sdk.coders.OptionalCoder
-
Returns the inner
Coder
wrapped by thisOptionalCoder
instance. - getValueCoder() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- getValueCoder() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getValueCoder() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getValueCoder() - Method in class org.apache.beam.sdk.testing.TestStream
- getValueCoder() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
- getValueCoder() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- getValueCoder() - Method in class org.apache.beam.sdk.values.WindowedValues.WindowedValueCoder
-
Returns the value coder.
- getValueDeserializerProvider() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getValueOnlyCoder(Coder<T>) - Static method in class org.apache.beam.sdk.values.WindowedValues
-
Returns the
ValueOnlyCoder
from the given valueCoder. - getValueOrDefault(String, T) - Method in class org.apache.beam.sdk.schemas.Schema.Options
-
Get the value of an option.
- getValues() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- getValues() - Method in class org.apache.beam.sdk.values.Row
-
Return the list of raw unmodified data values to enable 0-copy code.
- getValues() - Method in class org.apache.beam.sdk.values.RowWithGetters
-
Return the list of raw unmodified data values to enable 0-copy code.
- getValues() - Method in class org.apache.beam.sdk.values.RowWithStorage
- getValueSerializer() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getValuesMap() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- getValueTranslationFunction() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getValueTypeDescriptor() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- getVerifyRowValues() - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
- getVersion() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- getView() - Method in class org.apache.beam.runners.dataflow.CreateDataflowView
- getView() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
- getView() - Method in class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView.CreateSparkPCollectionView
- getView() - Method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
-
Deprecated.This should not be used to obtain the output of any given application of this
PTransform
. That should be obtained by inspecting theTransformHierarchy.Node
that contains thisView.CreatePCollectionView
, as this view may have been replaced within pipeline surgery. - getViewFn() - Method in interface org.apache.beam.sdk.values.PCollectionView
- getViewFn() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
- getVocabulary() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- getVocabulary() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- getWarehouse() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- getWarehouse() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- getWarehouse() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- getWarnings() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- getWatchInterval() - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
- getWatchTopicPartitionDuration() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getWatermark() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
- getWatermark() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicy
- getWatermark() - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.ArrivalTimeWatermarkPolicy
- getWatermark() - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.CustomWatermarkPolicy
- getWatermark() - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy
- getWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
- getWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
- getWatermark() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getWatermark() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time for which all records with a timestamp less than it have been processed.
- getWatermark() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns a timestamp before or at the timestamps of all future elements read by this reader.
- getWatermark() - Method in class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy
-
Returns watermark for the partition.
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.TimestampFnPolicy
- getWatermarkAndState() - Method in interface org.apache.beam.sdk.fn.splittabledofn.WatermarkEstimators.WatermarkAndStateObserver
- getWatermarkCache() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.cache.CacheFactory
- getWatermarkFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getWatermarkIdleDurationThreshold() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- getWatermarkIndexName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
- getWatermarkLastUpdated() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
- getWatermarkMillis() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Never.NeverTrigger
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
For internal use only; no backwards-compatibility guarantees.
- getWeigher(Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
- getWeight() - Method in class org.apache.beam.sdk.fn.data.WeightedList
- getWindmillGetDataStreamCount() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillHarnessUpdateReportingPeriod() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillMessagesBetweenIsReadyChecks() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillRequestBatchedGetWorkResponse() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillServiceCommitThreads() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillServiceEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
-
Custom windmill service endpoint.
- getWindmillServicePort() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillServiceRpcChannelAliveTimeoutSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillServiceStreamingLogEveryNStreamFailures() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillServiceStreamingRpcBatchLimit() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillServiceStreamingRpcHealthCheckPeriodMs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindmillServiceStreamMaxBackoffMillis() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- getWindow() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
-
This method returns the number of tuples in each window.
- getWindow() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
- getWindow() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the window of this
FailsafeValueInSingleWindow
. - getWindow() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Returns the window of this
ValueInSingleWindow
. - getWindow() - Method in interface org.apache.beam.sdk.values.WindowedValues.SingleWindowedValue
- getWindowCoder() - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
- getWindowCoder() - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- getWindowedAggregateDoFnOperator(FlinkStreamingTranslationContext, PTransform<PCollection<KV<K, InputT>>, PCollection<KV<K, OutputT>>>, KvCoder<K, InputAccumT>, Coder<WindowedValue<KV<K, OutputAccumT>>>, SystemReduceFn<K, InputAccumT, ?, OutputAccumT, BoundedWindow>, Map<Integer, PCollectionView<?>>, List<PCollectionView<?>>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
-
Create a DoFnOperator instance that group elements per window and apply a combine function on them.
- getWindowedAggregateDoFnOperator(FlinkStreamingTranslationContext, PTransform<PCollection<KV<K, InputT>>, PCollection<KV<K, OutputT>>>, KvCoder<K, InputAccumT>, Coder<WindowedValue<KV<K, OutputAccumT>>>, CombineFnBase.GlobalCombineFn<? super InputAccumT, ?, OutputAccumT>, Map<Integer, PCollectionView<?>>, List<PCollectionView<?>>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators
- getWindowedValueCoder(String, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
- getWindowedWrites() - Method in class org.apache.beam.sdk.io.WriteFiles
- getWindowFn() - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
- getWindowFn() - Method in class org.apache.beam.sdk.transforms.windowing.Window
- getWindowFn() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- getWindowingStrategy() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the
WindowingStrategy
of thisPCollection
. - getWindowingStrategy(String, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
- getWindowingStrategyInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
Deprecated.this method will be removed entirely. The
PCollection
underlying a side input, including itsWindowingStrategy
, is part of the side input's specification with aParDo
transform, which will obtain that information via a package-private channel. - getWindowingStrategyInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
-
Returns the
WindowingStrategy
of thisPCollectionView
, which should be that of the underlyingPCollection
. - getWindowMappingFn() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
For internal use only.
- getWindowMappingFn() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
- getWindows() - Method in interface org.apache.beam.sdk.values.WindowedValue
-
Returns the windows of this
WindowedValue
. - getWindows() - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- getWindowsCoder() - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- getWindowTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- getWindowTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Returns a
TypeDescriptor
capturing what is known statically about the window type of thisWindowFn
instance's most-derived class. - getWithAutoSharding() - Method in class org.apache.beam.sdk.io.WriteFiles
- getWithPartitions() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
- getWorkCompleted() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
-
The known amount of completed work.
- getWorkerCacheMb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The size of the worker's in-memory cache, in megabytes.
- getWorkerCPUs() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- getWorkerDiskType() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Specifies what type of persistent disk is used.
- getWorkerHarnessContainerImage() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Deprecated.
- getWorkerId() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
-
The identity of the worker running this pipeline.
- getWorkerId() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
- getWorkerLogLevelOverrides() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated.This option controls the log levels for specifically named loggers.
- getWorkerMachineType() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Machine type to create Dataflow worker VMs as.
- getWorkerPool() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
-
The identity of the worker pool of this worker.
- getWorkerRegion() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g.
- getWorkerSystemErrMessageLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated.Controls the log level given to messages printed to
System.err
. - getWorkerSystemOutMessageLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated.Controls the log level given to messages printed to
System.out
. - getWorkerZone() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g.
- getWorkRemaining() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
-
The known amount of work remaining.
- getWritableByteChannelFactory() - Method in class org.apache.beam.sdk.io.FileBasedSink
-
Returns the
FileBasedSink.WritableByteChannelFactory
used. - getWrite() - Method in class org.apache.beam.io.requestresponse.Cache.Pair
- getWrite() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- getWriteCounterPrefix() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
- getWriteDisposition() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- getWriteDisposition() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
- getWriteDisposition() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Getting disposition how write data to table, see:
WriteDisposition
. - getWriteFailures() - Method in exception class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.FailedWritesException
-
This list of
FirestoreV1.WriteFailure
s detailing which writes failed and for what reason. - getWriteOperation() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Return the WriteOperation that this Writer belongs to.
- getWriteRecordsTransform() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
- getWriteResult() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- getWriteStatement() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- getWriteStreamSchema(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
- getWriteStreamSchema(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- getWriteStreamSchema(String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getWriteStreamService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
-
Returns a real, mock, or fake
BigQueryServices.WriteStreamService
. - getWriteStreamService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
- getWriteStreamService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- getXmlConfiguration() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- getZetaSqlDefaultTimezone() - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
- getZetaSqlRuleSets() - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
- getZetaSqlRuleSets(Collection<RelOptRule>) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
- getZone() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
Deprecated.Use
GcpOptions.getWorkerZone()
instead. - global(Map<Integer, GlobalWatermarkHolder.SparkWatermarks>) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Build a global
TimerInternals
for all feeding streams. - Global() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.Global
- GLOBAL_SEQUENCE_TRACKER - Static variable in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessor
- GlobalConfigRefreshPeriodFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.GlobalConfigRefreshPeriodFactory
- globalDefault() - Static method in class org.apache.beam.sdk.values.WindowingStrategy
-
Return a fully specified, default windowing strategy.
- GlobalDigest() - Constructor for class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.GlobalDigest
- globally() - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
-
Computes the approximate number of distinct elements in the input
PCollection<InputT>
and returns aPCollection<Long>
. - globally() - Static method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
-
Create the
PTransform
that will build a Count-min sketch for keeping track of the frequency of the elements in the whole stream. - globally() - Static method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
-
Compute the stream in order to build a T-Digest structure (MergingDigest) for keeping track of the stream distribution and returns a
PCollection<MergingDigest>
. - globally() - Static method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- globally() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Extract
-
Returns a
PTransform
that takes an inputPCollection<byte[]>
of HLL++ sketches and returns aPCollection<Long>
of the estimated count of distinct elements extracted from each sketch. - globally() - Method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init.Builder
-
Returns a
Combine.Globally
PTransform
that takes an inputPCollection<InputT>
and returns aPCollection<byte[]>
which consists of the HLL++ sketch computed from the elements in the inputPCollection
. - globally() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.MergePartial
-
Returns a
Combine.Globally
PTransform
that takes an inputPCollection<byte[]>
of HLL++ sketches and returns aPCollection<byte[]>
of a new sketch merged from the input sketches. - globally() - Static method in class org.apache.beam.sdk.schemas.transforms.Group
-
Returns a transform that groups all elements in the input
PCollection
. - globally() - Static method in class org.apache.beam.sdk.transforms.Count
-
Returns a
PTransform
that counts the number of elements in its inputPCollection
. - globally() - Static method in class org.apache.beam.sdk.transforms.Latest
-
Returns a
PTransform
that takes as input aPCollection<T>
and returns aPCollection<T>
whose contents is the latest element according to its event time, or null if there are no elements. - globally() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a
PTransform
that takes an inputPCollection<T>
and returns aPCollection<T>
whose contents is the maximum according to the natural ordering ofT
of the inputPCollection
's elements, ornull
if there are no elements. - globally() - Static method in class org.apache.beam.sdk.transforms.Mean
-
Returns a
PTransform
that takes an inputPCollection<NumT>
and returns aPCollection<Double>
whose contents is the mean of the inputPCollection
's elements, or0
if there are no elements. - globally() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a
PTransform
that takes an inputPCollection<T>
and returns aPCollection<T>
whose contents is the minimum according to the natural ordering ofT
of the inputPCollection
's elements, ornull
if there are no elements. - globally(double) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
-
Deprecated.Like
ApproximateUnique.globally(int)
, but specifies the desired maximum estimation error instead of the sample size. - globally(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
-
Like
ApproximateQuantiles.globally(int, Comparator)
, but sorts using the elements' natural ordering. - globally(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
-
Deprecated.Returns a
PTransform
that takes aPCollection<T>
and returns aPCollection<Long>
containing a single value that is an estimate of the number of distinct elements in the inputPCollection
. - globally(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
-
Returns a
PTransform
that takes aPCollection<T>
and returns aPCollection<List<T>>
whose single value is aList
of the approximateN
-tiles of the elements of the inputPCollection
. - globally(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a
PTransform
that takes an inputPCollection<T>
and returns aPCollection<T>
whose contents is the maximum of the inputPCollection
's elements, ornull
if there are no elements. - globally(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a
PTransform
that takes an inputPCollection<T>
and returns aPCollection<T>
whose contents is the minimum of the inputPCollection
's elements, ornull
if there are no elements. - globally(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.Globally
PTransform
that uses the givenGloballyCombineFn
to combine all the elements in each window of the inputPCollection
into a single value in the outputPCollection
. - globally(CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>, SerializablePipelineOptions, Map<TupleTag<?>, KV<WindowingStrategy<?, ?>, SideInputBroadcast<?>>>, WindowingStrategy<?, ?>) - Static method in class org.apache.beam.runners.spark.translation.SparkCombineFn
- globally(SerializableBiFunction<V, V, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.Globally
PTransform
that uses the givenSerializableBiFunction
to combine all the elements in each window of the inputPCollection
into a single value in the outputPCollection
. - globally(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.Globally
PTransform
that uses the givenSerializableFunction
to combine all the elements in each window of the inputPCollection
into a single value in the outputPCollection
. - Globally() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
- Globally(double) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.Globally
-
Deprecated.
- Globally(int) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.Globally
-
Deprecated.
- GloballyDistinct() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.GloballyDistinct
- GlobalSketch() - Constructor for class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.GlobalSketch
- GlobalWatermarkHolder - Class in org.apache.beam.runners.spark.util
-
A store to hold the global watermarks for a micro-batch.
- GlobalWatermarkHolder() - Constructor for class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
- GlobalWatermarkHolder.SparkWatermarks - Class in org.apache.beam.runners.spark.util
-
A
GlobalWatermarkHolder.SparkWatermarks
holds the watermarks and batch time relevant to a micro-batch input from a specific source. - GlobalWatermarkHolder.WatermarkAdvancingStreamingListener - Class in org.apache.beam.runners.spark.util
-
Advance the WMs onBatchCompleted event.
- GlobalWindow - Class in org.apache.beam.sdk.transforms.windowing
-
The default window into which all data is placed (via
GlobalWindows
). - GlobalWindow.Coder - Class in org.apache.beam.sdk.transforms.windowing
-
GlobalWindow.Coder
for encoding and decodingGlobalWindow
s. - GlobalWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A
WindowFn
that assigns all data to the same window. - GlobalWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- GoogleADCIdTokenProvider - Class in org.apache.beam.sdk.io.aws2.auth
-
A OIDC web identity token provider implementation that uses the application default credentials set by the runtime (container, GCE instance, local environment, etc.).
- GoogleADCIdTokenProvider() - Constructor for class org.apache.beam.sdk.io.aws2.auth.GoogleADCIdTokenProvider
- GoogleAdsClientFactory - Interface in org.apache.beam.sdk.io.googleads
-
Defines how to construct a
GoogleAdsClient
. - GoogleAdsCredentialsFactory() - Constructor for class org.apache.beam.sdk.io.googleads.GoogleAdsOptions.GoogleAdsCredentialsFactory
- GoogleAdsIO<GoogleAdsRowT,
SearchGoogleAdsStreamRequestT> - Class in org.apache.beam.sdk.io.googleads -
GoogleAdsIO
provides an API for reading from the Google Ads API over supported versions of the Google Ads client libraries. - GoogleAdsIO() - Constructor for class org.apache.beam.sdk.io.googleads.GoogleAdsIO
- GoogleAdsIO.RateLimitPolicy<GoogleAdsErrorT> - Interface in org.apache.beam.sdk.io.googleads
-
This interface can be used to implement custom client-side rate limiting policies.
- GoogleAdsIO.RateLimitPolicyFactory<GoogleAdsErrorT> - Interface in org.apache.beam.sdk.io.googleads
-
Implement this interface to create a
GoogleAdsIO.RateLimitPolicy
. - GoogleAdsOptions - Interface in org.apache.beam.sdk.io.googleads
-
Options used to configure Google Ads API specific options.
- GoogleAdsOptions.GoogleAdsCredentialsFactory - Class in org.apache.beam.sdk.io.googleads
-
Attempts to load the Google Ads credentials.
- GoogleAdsUserCredentialFactory - Class in org.apache.beam.sdk.io.googleads
-
Constructs and returns
Credentials
to be used by Google Ads API calls. - GoogleAdsV19 - Class in org.apache.beam.sdk.io.googleads
-
GoogleAdsV19
provides an API to read Google Ads API v19 reports. - GoogleAdsV19.Read - Class in org.apache.beam.sdk.io.googleads
-
A
PTransform
that reads the results of a Google Ads query asGoogleAdsRow
objects. - GoogleAdsV19.ReadAll - Class in org.apache.beam.sdk.io.googleads
-
A
PTransform
that reads the results of manySearchGoogleAdsStreamRequest
objects asGoogleAdsRow
objects. - GoogleAdsV19.SimpleRateLimitPolicy - Class in org.apache.beam.sdk.io.googleads
-
This rate limit policy wraps a
RateLimiter
and can be used in low volume and development use cases as a client-side rate limiting policy. - GoogleApiDebugOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
-
These options configure debug settings for Google API clients created within the Apache Beam SDK.
- GoogleApiDebugOptions.GoogleApiTracer - Class in org.apache.beam.sdk.extensions.gcp.options
-
A
GoogleClientRequestInitializer
that adds the trace destination to Google API calls. - GoogleApiTracer() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
- GraphiteSink - Class in org.apache.beam.runners.spark.metrics.sink
-
A
Sink
for Spark's metric system reporting metrics (including Beam step metrics) to Graphite. - GraphiteSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
-
Constructor for Spark 3.2.x and later.
- GraphiteSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
-
Constructor for Spark 3.1.x and earlier.
- GREATER_THAN - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- GREATER_THAN_OR_EQUAL - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- greaterThan(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.greaterThan(Comparable)
. - greaterThan(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.greaterThan(Comparable)
. - greaterThan(T) - Static method in class org.apache.beam.sdk.transforms.Filter
-
Returns a
PTransform
that takes an inputPCollection<T>
and returns aPCollection<T>
with elements that are greater than a given value, based on the elements' natural ordering. - greaterThanEq(T) - Static method in class org.apache.beam.sdk.transforms.Filter
-
Returns a
PTransform
that takes an inputPCollection<T>
and returns aPCollection<T>
with elements that are greater than or equal to a given value, based on the elements' natural ordering. - greaterThanOrEqualTo(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.greaterThanOrEqualTo(Comparable)
. - greaterThanOrEqualTo(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.greaterThanOrEqualTo(Comparable)
. - Group - Class in org.apache.beam.sdk.schemas.transforms
-
A generic grouping transform for schema
PCollection
s. - Group() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group
- Group.AggregateCombiner<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
a
PTransform
that does a combine using an aggregation built up by calls to aggregateField and aggregateFields. - Group.ByFields<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
a
PTransform
that groups schema elements based on the given fields. - Group.CombineFieldsByFields<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
a
PTransform
that does a per-key combine using an aggregation built up by calls to aggregateField and aggregateFields. - Group.CombineFieldsByFields.Fanout - Class in org.apache.beam.sdk.schemas.transforms
- Group.CombineFieldsByFields.Fanout.Kind - Enum Class in org.apache.beam.sdk.schemas.transforms
- Group.CombineFieldsGlobally<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
a
PTransform
that does a global combine using an aggregation built up by calls to aggregateField and aggregateFields. - Group.CombineGlobally<InputT,
OutputT> - Class in org.apache.beam.sdk.schemas.transforms -
a
PTransform
that does a global combine using a providerCombine.CombineFn
. - Group.Global<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
A
PTransform
for doing global aggregations on schema PCollections. - GroupAlsoByWindowViaOutputBufferFn<K,
InputT, - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functionsW> -
A FlatMap function that groups by windows in batch mode using
ReduceFnRunner
. - GroupAlsoByWindowViaOutputBufferFn(WindowingStrategy<?, W>, StateInternalsFactory<K>, SystemReduceFn<K, InputT, Iterable<InputT>, Iterable<InputT>, W>, Supplier<PipelineOptions>) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.GroupAlsoByWindowViaOutputBufferFn
- GroupByKey<K,
V> - Class in org.apache.beam.sdk.transforms -
GroupByKey<K, V>
takes aPCollection<KV<K, V>>
, groups the values by key and windows, and returns aPCollection<KV<K, Iterable<V>>>
representing a map from each distinct key and window of the inputPCollection
to anIterable
over all the values associated with that key in the input per window. - groupByKeyAndWindow(JavaDStream<WindowedValue<KV<K, InputT>>>, Coder<K>, Coder<WindowedValue<InputT>>, WindowingStrategy<?, W>, SerializablePipelineOptions, List<Integer>, String) - Static method in class org.apache.beam.runners.spark.stateful.SparkGroupAlsoByWindowViaWindowSet
- groupByKeyOnly(JavaRDD<WindowedValue<KV<K, V>>>, Coder<K>, WindowedValues.WindowedValueCoder<V>, Partitioner) - Static method in class org.apache.beam.runners.spark.translation.GroupCombineFunctions
-
An implementation of
GroupByKeyViaGroupByKeyOnly.GroupByKeyOnly
for the Spark runner. - GroupByKeyTranslatorBatch<K,
V> - Class in org.apache.beam.runners.twister2.translators.batch -
GroupByKey translator.
- GroupByKeyTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.GroupByKeyTranslatorBatch
- GroupByKeyVisitor - Class in org.apache.beam.runners.spark.translation
-
Traverses the pipeline to populate the candidates for group by key.
- GroupByKeyVisitor(SparkPipelineTranslator, EvaluationContext) - Constructor for class org.apache.beam.runners.spark.translation.GroupByKeyVisitor
- GroupByWindowFunction<K,
V, - Class in org.apache.beam.runners.twister2.translators.functionsW> -
GroupBy window function.
- GroupByWindowFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
- GroupByWindowFunction(WindowingStrategy<?, W>, SystemReduceFn<K, V, Iterable<V>, Iterable<V>, W>, PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
- GroupCombineFunctions - Class in org.apache.beam.runners.spark.translation
-
A set of group/combine functions to apply to Spark
RDD
s. - GroupCombineFunctions() - Constructor for class org.apache.beam.runners.spark.translation.GroupCombineFunctions
- grouped() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Same transform but can be applied to
PCollection
ofMutationGroup
. - groupedValues(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.GroupedValues
PTransform
that takes aPCollection
ofKV
s where a key maps to anIterable
of values, e.g., the result of aGroupByKey
, then uses the givenCombineFn
to combine all the values associated with a key, ignoring the key. - groupedValues(SerializableBiFunction<V, V, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.GroupedValues
PTransform
that takes aPCollection
ofKV
s where a key maps to anIterable
of values, e.g., the result of aGroupByKey
, then uses the givenSerializableFunction
to combine all the values associated with a key, ignoring the key. - groupedValues(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.GroupedValues
PTransform
that takes aPCollection
ofKV
s where a key maps to anIterable
of values, e.g., the result of aGroupByKey
, then uses the givenSerializableFunction
to combine all the values associated with a key, ignoring the key. - GroupingState<InputT,
OutputT> - Interface in org.apache.beam.sdk.state -
A
ReadableState
cell that combines multiple input values and outputs a single value of a different type. - GroupIntoBatches<K,
InputT> - Class in org.apache.beam.sdk.transforms -
A
PTransform
that batches inputs to a desired batch size. - GroupIntoBatches.BatchingParams<InputT> - Class in org.apache.beam.sdk.transforms
-
Wrapper class for batching parameters supplied by users.
- GroupIntoBatches.WithShardedKey - Class in org.apache.beam.sdk.transforms
- GroupIntoBatchesOverride - Class in org.apache.beam.runners.dataflow
- GroupIntoBatchesOverride() - Constructor for class org.apache.beam.runners.dataflow.GroupIntoBatchesOverride
- GroupNonMergingWindowsFunctions - Class in org.apache.beam.runners.spark.translation
-
Functions for GroupByKey with Non-Merging windows translations to Spark.
- GroupNonMergingWindowsFunctions() - Constructor for class org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions
- groups() - Element in annotation interface org.apache.beam.sdk.options.Validation.Required
-
The groups that the annotated attribute is a member of.
- GrowableOffsetRangeTracker - Class in org.apache.beam.sdk.transforms.splittabledofn
-
An
OffsetRangeTracker
for tracking a growable offset range. - GrowableOffsetRangeTracker(long, GrowableOffsetRangeTracker.RangeEndEstimator) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker
- GrowableOffsetRangeTracker.RangeEndEstimator - Interface in org.apache.beam.sdk.transforms.splittabledofn
-
Provides the estimated end offset of the range.
- Growth() - Constructor for class org.apache.beam.sdk.transforms.Watch.Growth
- growthOf(Contextful<Watch.Growth.PollFn<InputT, OutputT>>, SerializableFunction<OutputT, KeyT>) - Static method in class org.apache.beam.sdk.transforms.Watch
-
Watches the growth of the given poll function, using the given "key function" to deduplicate outputs.
- growthOf(Watch.Growth.PollFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch
-
Watches the growth of the given poll function.
- growthOf(Watch.Growth.PollFn<InputT, OutputT>, Requirements) - Static method in class org.apache.beam.sdk.transforms.Watch
-
Watches the growth of the given poll function.
- GrpcContextHeaderAccessorProvider - Class in org.apache.beam.sdk.fn.server
-
A HeaderAccessorProvider which intercept the header in a GRPC request and expose the relevant fields.
- GrpcContextHeaderAccessorProvider() - Constructor for class org.apache.beam.sdk.fn.server.GrpcContextHeaderAccessorProvider
- GrpcDataService - Class in org.apache.beam.runners.fnexecution.data
-
A
FnDataService
implemented via gRPC. - GrpcDataService() - Constructor for class org.apache.beam.runners.fnexecution.data.GrpcDataService
-
Deprecated.This constructor is for migrating Dataflow purpose only.
- GrpcFnServer<ServiceT> - Class in org.apache.beam.sdk.fn.server
-
A
gRPC Server
which manages a singleFnService
. - GrpcLoggingService - Class in org.apache.beam.runners.fnexecution.logging
-
An implementation of the Beam Fn Logging Service over gRPC.
- GrpcStateService - Class in org.apache.beam.runners.fnexecution.state
-
An implementation of the Beam Fn State service.
- guessExpressionType(String, Map<String, Type>) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler
- GZIP - Enum constant in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- GZIP - Enum constant in enum class org.apache.beam.sdk.io.Compression
-
GZip compression.
- GZIP - Enum constant in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- GZIP - Enum constant in enum class org.apache.beam.sdk.io.TextIO.CompressionType
-
Deprecated.
- GZIP - Enum constant in enum class org.apache.beam.sdk.io.TFRecordIO.CompressionType
-
Deprecated.
- GZIP - Enum constant in enum class org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
-
Deprecated.
H
- HADOOP - Enum constant in enum class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options.SorterType
- hadoopConfiguration - Variable in class org.apache.beam.sdk.io.cdap.Plugin
- HadoopFileSystemModule - Class in org.apache.beam.sdk.io.hdfs
- HadoopFileSystemModule() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemModule
- HadoopFileSystemOptions - Interface in org.apache.beam.sdk.io.hdfs
- HadoopFileSystemOptions.ConfigurationLocator - Class in org.apache.beam.sdk.io.hdfs
-
A
DefaultValueFactory
which locates a HadoopConfiguration
. - HadoopFileSystemOptionsRegistrar - Class in org.apache.beam.sdk.io.hdfs
-
AutoService
registrar forHadoopFileSystemOptions
. - HadoopFileSystemOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptionsRegistrar
- HadoopFileSystemRegistrar - Class in org.apache.beam.sdk.io.hdfs
-
AutoService
registrar for theHadoopFileSystem
. - HadoopFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemRegistrar
- HadoopFormatIO - Class in org.apache.beam.sdk.io.hadoop.format
-
A
HadoopFormatIO
is a Transform for reading data from any source or writing data to any sink which implements HadoopInputFormat
orOutputFormat
. - HadoopFormatIO() - Constructor for class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
- HadoopFormatIO.HadoopInputFormatBoundedSource<K,
V> - Class in org.apache.beam.sdk.io.hadoop.format -
Bounded source implementation for
HadoopFormatIO
. - HadoopFormatIO.Read<K,
V> - Class in org.apache.beam.sdk.io.hadoop.format -
A
PTransform
that reads from any data source which implements Hadoop InputFormat. - HadoopFormatIO.SerializableSplit - Class in org.apache.beam.sdk.io.hadoop.format
-
A wrapper to allow Hadoop
InputSplit
to be serialized using Java's standard serialization mechanisms. - HadoopFormatIO.Write<KeyT,
ValueT> - Class in org.apache.beam.sdk.io.hadoop.format -
A
PTransform
that writes to any data sink which implements Hadoop OutputFormat. - HadoopFormatIO.Write.ExternalSynchronizationBuilder<KeyT,
ValueT> - Interface in org.apache.beam.sdk.io.hadoop.format -
Builder for External Synchronization defining.
- HadoopFormatIO.Write.PartitionedWriterBuilder<KeyT,
ValueT> - Interface in org.apache.beam.sdk.io.hadoop.format -
Builder for partitioning determining.
- HadoopFormatIO.Write.WriteBuilder<KeyT,
ValueT> - Interface in org.apache.beam.sdk.io.hadoop.format -
Main builder of Write transformation.
- HadoopInputFormatBoundedSource(SerializableConfiguration, Coder<K>, Coder<V>, SimpleFunction<?, K>, SimpleFunction<?, V>, HadoopFormatIO.SerializableSplit, boolean, boolean) - Constructor for class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
- handle(BeamFnApi.InstructionRequest) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
- handle(BeamFnApi.InstructionRequest) - Method in interface org.apache.beam.runners.fnexecution.control.InstructionRequestHandler
- handle(BeamFnApi.StateRequest) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandler
-
Handle a
BeamFnApi.StateRequest
asynchronously. - handle(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
-
Opportunity to further refine the relational expression created for a given level.
- handleErrorEx(Object, JCSMPException, long) - Method in class org.apache.beam.sdk.io.solace.broker.PublishResultHandler
- handleSplitRequest(int, String) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplitEnumerator
- handleSplitRequest(int, String) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.LazyFlinkSourceSplitEnumerator
- Handling Errors - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- HarnessUpdateReportingPeriodFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.HarnessUpdateReportingPeriodFactory
- has(String) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Returns whether this
PCollectionRowTuple
contains aPCollection
with the given tag. - has(String) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
Returns whether this
PCollectionTuple
contains aPCollection
with the given tag. - has(TupleTag<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
Returns whether this
PCollectionTuple
contains aPCollection
with the given tag. - hasAnyPrefix() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
- hasCommitted() - Method in class org.apache.beam.sdk.metrics.MetricResult
- hasDatabase(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogBeamSchema
-
Checks if metastore client has the specified database.
- hasDefault() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated. Returns whether a default value was specified.
- hasDefault() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
-
Returns whether a default value was specified.
- HasDefaultTracker<RestrictionT, TrackerT> - Interface in org.apache.beam.sdk.transforms.splittabledofn
-
Interface for restrictions for which a default implementation of DoFn.NewTracker is available, depending only on the restriction itself.
- hasDefaultValue() - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
-
Returns whether this transform has a default value.
- HasDefaultWatermarkEstimator<WatermarkEstimatorStateT, WatermarkEstimatorT> - Interface in org.apache.beam.sdk.transforms.splittabledofn
-
Interface for watermark estimator state for which a default implementation of DoFn.NewWatermarkEstimator is available, depending only on the watermark estimator state itself.
- HasDisplayData - Interface in org.apache.beam.sdk.transforms.display
-
Marker interface for PTransforms and components to specify display data used within UIs and diagnostic tools.
- hasErrored() - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
Returns whether this handler has errored since it was last reset.
- hasEventTimers(DoFn<?, ?>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Checks if the given DoFn uses event time timers.
- hasException() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- hasExperiment(DataflowPipelineDebugOptions, String) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner
-
Returns true if the specified experiment is enabled, handling null experiments.
- hasExperiment(PipelineOptions, String) - Static method in interface org.apache.beam.sdk.options.ExperimentalOptions
-
Returns true iff the provided pipeline options have the specified experiment enabled.
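A short sketch; the experiment name "use_runner_v2" is used purely for illustration:

    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    if (ExperimentalOptions.hasExperiment(options, "use_runner_v2")) {
      // apply experiment-specific configuration here
    }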
- hasFailedRecords(List<ResT>) - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
- hasField(String) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns true if fieldName exists in the schema, false otherwise.
- hasGlobWildcard(String) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Checks whether the given spec contains a glob wildcard character.
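For example (the paths are illustrative):

    FileSystems.hasGlobWildcard("gs://my-bucket/logs/*.json");           // true
    FileSystems.hasGlobWildcard("gs://my-bucket/logs/2024-01-01.json");  // false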
- hash(byte[]) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- hash(List<?>) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
-
Computes the shard id for the given key component(s).
- hashCode() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- hashCode() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
- hashCode() - Method in class org.apache.beam.runners.dataflow.util.OutputReference
- hashCode() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
- hashCode() - Method in class org.apache.beam.runners.flink.adapter.FlinkKey
- hashCode() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- hashCode() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- hashCode() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- hashCode() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- hashCode() - Method in class org.apache.beam.runners.jet.Utils.ByteArrayKey
- hashCode() - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
- hashCode() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
- hashCode() - Method in class org.apache.beam.runners.spark.util.ByteArray
- hashCode() - Method in class org.apache.beam.runners.spark.util.TimerUtils.TimerMarker
- hashCode() - Method in class org.apache.beam.sdk.coders.AtomicCoder
- hashCode() - Method in class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated.
- hashCode() - Method in class org.apache.beam.sdk.coders.DelegateCoder
- hashCode() - Method in class org.apache.beam.sdk.coders.RowCoder
- hashCode() - Method in class org.apache.beam.sdk.coders.SerializableCoder
- hashCode() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- hashCode() - Method in class org.apache.beam.sdk.coders.StructuralByteArray
- hashCode() - Method in class org.apache.beam.sdk.coders.StructuredCoder
- hashCode() - Method in class org.apache.beam.sdk.coders.ZstdCoder
- hashCode() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- hashCode() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
- hashCode() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
- hashCode() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- hashCode() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
- hashCode() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- hashCode() - Method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
- hashCode() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- hashCode() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- hashCode() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
- hashCode() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
- hashCode() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- hashCode() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- hashCode() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- hashCode() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
- hashCode() - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
- hashCode() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteSuccessSummary
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEndRecord
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEventRecord
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionStartRecord
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- hashCode() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
- hashCode() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
- hashCode() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
- hashCode() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
- hashCode() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
- hashCode() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- hashCode() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- hashCode() - Method in class org.apache.beam.sdk.io.range.ByteKey
- hashCode() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
- hashCode() - Method in class org.apache.beam.sdk.io.range.OffsetRange
- hashCode() - Method in class org.apache.beam.sdk.io.redis.RedisCursor
- hashCode() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
Override this method to allow these objects to be compared by value.
- hashCode() - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
-
Override this method to allow these objects to be compared by value.
- hashCode() - Method in class org.apache.beam.sdk.io.solace.read.SolaceCheckpointMark
- hashCode() - Method in class org.apache.beam.sdk.io.tika.ParseResult
- hashCode() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
- hashCode() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
- hashCode() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
- hashCode() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
- hashCode() - Method in class org.apache.beam.sdk.schemas.CachingFactory
- hashCode() - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated.
- hashCode() - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
- hashCode() - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
- hashCode() - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.SetterTypeSupplier
- hashCode() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType.Value
- hashCode() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
- hashCode() - Method in class org.apache.beam.sdk.schemas.Schema.Field
- hashCode() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- hashCode() - Method in class org.apache.beam.sdk.schemas.Schema
- hashCode() - Method in class org.apache.beam.sdk.schemas.Schema.Options
- hashCode() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
- hashCode() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- hashCode() - Method in class org.apache.beam.sdk.testing.PAssert.PAssertionSite
- hashCode() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
Deprecated. Object.hashCode() is not supported on PAssert objects.
- hashCode() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
- hashCode() - Method in class org.apache.beam.sdk.testing.TestStream
- hashCode() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
-
Deprecated.
- hashCode() - Method in class org.apache.beam.sdk.transforms.CombineFns.CoCombineResult
- hashCode() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
- hashCode() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
- hashCode() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
- hashCode() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
- hashCode() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
- hashCode() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
- hashCode() - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHint
- hashCode() - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
- hashCode() - Method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
- hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
- hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
- hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
- hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
- hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
- hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
- hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
- hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
- hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
- hashCode() - Method in class org.apache.beam.sdk.values.EncodableThrowable
- hashCode() - Method in class org.apache.beam.sdk.values.KV
- hashCode() - Method in class org.apache.beam.sdk.values.PCollectionList
- hashCode() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
- hashCode() - Method in class org.apache.beam.sdk.values.PCollectionTuple
- hashCode() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
- hashCode() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
- hashCode() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
- hashCode() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- hashCode() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
- hashCode() - Method in class org.apache.beam.sdk.values.Row
- hashCode() - Method in class org.apache.beam.sdk.values.RowWithGetters
- hashCode() - Method in class org.apache.beam.sdk.values.ShardedKey
- hashCode() - Method in class org.apache.beam.sdk.values.TimestampedValue
- hashCode() - Method in class org.apache.beam.sdk.values.TupleTag
- hashCode() - Method in class org.apache.beam.sdk.values.TypeDescriptor
- hashCode() - Method in class org.apache.beam.sdk.values.TypeParameter
- hashCode() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
- hashCode() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- hashCode(WindowedValue<T>) - Static method in class org.apache.beam.sdk.values.WindowedValues
- HashingFlinkCombineRunner<K, InputT, AccumT, OutputT, W> - Class in org.apache.beam.runners.flink.translation.functions
-
A Flink combine runner that builds a map of merged windows and produces output after seeing all input.
- HashingFlinkCombineRunner() - Constructor for class org.apache.beam.runners.flink.translation.functions.HashingFlinkCombineRunner
- hasItem(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.hasItem(Object).
- hasItem(SerializableMatcher<? super T>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.hasItem(Matcher).
- hasItem(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.hasItem(Object).
- hasNext() - Method in class org.apache.beam.runners.spark.translation.streaming.ParDoStateUpdateFn.SparkTimerInternalsIterator
- hasNext() - Method in class org.apache.beam.runners.twister2.translation.wrappers.Twister2BoundedSource
- hasNext() - Method in class org.apache.beam.runners.twister2.translation.wrappers.Twister2EmptySource
- hasNext() - Method in class org.apache.beam.runners.twister2.translators.functions.ImpulseSource
- hasNext() - Method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion.RecordBatchRowIterator
- hasNext() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
- hasNext() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
- hasNext() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages.HL7v2MessagePagesIterator
- hasNextProcessingTimer() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Checks if there are any expired timers in the TimeDomain.PROCESSING_TIME domain.
- HasOffset - Interface in org.apache.beam.sdk.io.sparkreceiver
-
Interface for any Spark Receiver that supports reading from and to some offset.
- hasOption(String) - Method in class org.apache.beam.sdk.schemas.Schema.Options
- hasOptions() - Method in class org.apache.beam.sdk.schemas.Schema.Options
- hasOutput(ErrorHandling) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
- hasReplacementJob() - Method in enum class org.apache.beam.sdk.PipelineResult.State
- hasSchema() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns whether this PCollection has an attached schema.
- hasSize(int) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.hasSize(int).
- hasSize(SerializableMatcher<? super Integer>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.hasSize(Matcher).
- hasTimers(DoFn<?, ?>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Checks if the given DoFn uses any timers.
- hasTranslation(PTransform<?, ?>) - Method in interface org.apache.beam.runners.spark.translation.SparkPipelineTranslator
- hasTranslation(PTransform<?, ?>) - Method in class org.apache.beam.runners.spark.translation.streaming.StreamingTransformTranslator.Translator
- hasTranslation(PTransform<?, ?>) - Method in class org.apache.beam.runners.spark.translation.TransformTranslator.Translator
- hasUnboundedPCollections(RunnerApi.Pipeline) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
-
Indicates whether the given pipeline has any unbounded PCollections.
- hasUnresolvedParameters() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns whether this TypeDescriptor has any unresolved type parameters, as opposed to being a concrete type.
- HBaseCoderProviderRegistrar - Class in org.apache.beam.sdk.io.hbase
-
A CoderProviderRegistrar for standard types used with HBaseIO.
- HBaseCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.hbase.HBaseCoderProviderRegistrar
- HBaseIO - Class in org.apache.beam.sdk.io.hbase
-
A bounded source and sink for HBase.
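A hedged read sketch (the ZooKeeper quorum and table id are assumptions):

    Configuration conf = HBaseConfiguration.create();
    conf.set("hbase.zookeeper.quorum", "zk-host");
    PCollection<Result> rows =
        pipeline.apply(HBaseIO.read().withConfiguration(conf).withTableId("my-table"));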
- HBaseIO.Read - Class in org.apache.beam.sdk.io.hbase
-
A PTransform that reads from HBase.
- HBaseIO.ReadAll - Class in org.apache.beam.sdk.io.hbase
-
Implementation of HBaseIO.readAll().
- HBaseIO.Write - Class in org.apache.beam.sdk.io.hbase
-
A PTransform that writes to HBase.
- HBaseIO.WriteRowMutations - Class in org.apache.beam.sdk.io.hbase
-
Transformation that writes RowMutation objects to an HBase table.
- HCatalogBeamSchema - Class in org.apache.beam.sdk.io.hcatalog
-
Adapter from HCatalog table schema to Beam Schema.
- HCatalogIO - Class in org.apache.beam.sdk.io.hcatalog
-
IO to read and write data using HCatalog.
- HCatalogIO.Read - Class in org.apache.beam.sdk.io.hcatalog
-
A PTransform to read data using HCatalog.
- HCatalogIO.Write - Class in org.apache.beam.sdk.io.hcatalog
-
A PTransform to write to an HCatalog-managed source.
- HCatalogTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog
-
Beam SQL table that wraps HCatalogIO.
- HCatalogTable() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
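A hedged read sketch for HCatalogIO above (the metastore URI, database, and table names are assumptions):

    Map<String, String> configProperties = new HashMap<>();
    configProperties.put("hive.metastore.uris", "thrift://metastore-host:9083");
    PCollection<HCatRecord> records = pipeline.apply(
        HCatalogIO.read()
            .withConfigProperties(configProperties)
            .withDatabase("default")
            .withTable("my_table"));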
- HCatalogUtils - Class in org.apache.beam.sdk.io.hcatalog
-
Utility classes to enable meta store conf/client creation.
- HCatalogUtils() - Constructor for class org.apache.beam.sdk.io.hcatalog.HCatalogUtils
- HCatToRow - Class in org.apache.beam.sdk.io.hcatalog
-
Utilities to convert HCatRecords to Rows.
- HCatToRow() - Constructor for class org.apache.beam.sdk.io.hcatalog.HCatToRow
- HDFSSynchronization - Class in org.apache.beam.sdk.io.hadoop.format
-
Implementation of ExternalSynchronization which registers locks in HDFS.
- HDFSSynchronization(String) - Constructor for class org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization
-
Creates an instance of HDFSSynchronization.
- HeaderAccessor - Interface in org.apache.beam.sdk.fn.server
-
Interface to access headers in the client request.
- HealthcareApiClient - Interface in org.apache.beam.sdk.io.gcp.healthcare
-
Defines a client to communicate with the GCP HCLS API (version v1).
- HealthcareIOError<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Class for capturing errors on IO operations on Google Cloud Healthcare APIs resources.
- HealthcareIOErrorCoder<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
- HealthcareIOErrorToTableRow<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Convenience transform to write dead-letter HealthcareIOErrors to BigQuery TableRows.
- HealthcareIOErrorToTableRow() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
- HEARTBEAT_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Counter for the total number of heartbeats identified during the execution of the Connector.
- HEARTBEAT_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of heartbeat records identified during the execution of the Connector.
- HeartbeatRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
A heartbeat record serves as a notification that the change stream query has returned all changes for the partition less than or equal to the record timestamp.
- HeartbeatRecord(Timestamp, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
-
Constructs the heartbeat record with the given timestamp and metadata.
- heartbeatRecordAction(ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class capable of processing HeartbeatRecords.
- HeartbeatRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is part of the process for the ReadChangeStreamPartitionDoFn SDF.
- helloWorld() - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.HelloWorldFn
- HelloWorldFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.HelloWorldFn
- Hidden - Annotation Interface in org.apache.beam.sdk.options
-
Methods and/or interfaces annotated with @Hidden will be suppressed from being output when --help is specified on the command-line.
- HIGHER_BIT_SIGNED - Enum constant in enum class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Behavior
-
Uses the signed primitive with the next higher bit count.
- HIGHER_THROUGHPUT - Enum constant in enum class org.apache.beam.sdk.io.solace.SolaceIO.SubmissionMode
- hintNumWorkers - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
- hintNumWorkers - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
- hintNumWorkers - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
- hints() - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
- Histogram - Interface in org.apache.beam.sdk.metrics
-
A metric that reports information about the histogram of reported values.
- HISTOGRAM_BUCKET_TYPE - Static variable in class org.apache.beam.sdk.extensions.gcp.util.LatencyRecordingHttpRequestInitializer
- HL7v2IO - Class in org.apache.beam.sdk.io.gcp.healthcare
-
HL7v2IO provides an API for reading from and writing to the Google Cloud Healthcare HL7v2 API.
- HL7v2IO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
- HL7v2IO.HL7v2Read - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Read that reads HL7v2 message contents given a PCollection of HL7v2ReadParameter.
- HL7v2IO.HL7v2Read.FetchHL7v2Message - Class in org.apache.beam.sdk.io.gcp.healthcare
-
PTransform to fetch a message from a Google Cloud Healthcare HL7v2 store based on msgID.
- HL7v2IO.HL7v2Read.FetchHL7v2Message.HL7v2MessageGetFn - Class in org.apache.beam.sdk.io.gcp.healthcare
-
DoFn for fetching messages from the HL7v2 store with error handling.
- HL7v2IO.HL7v2Read.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Result includes a PCollection of HL7v2ReadResponse objects for successfully read results and a PCollection of HealthcareIOError objects for failed reads.
- HL7v2IO.ListHL7v2Messages - Class in org.apache.beam.sdk.io.gcp.healthcare
-
List HL7v2 messages in HL7v2 Stores with optional filter.
- HL7v2IO.Read - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Read that reads HL7v2 message contents given a PCollection of message ID strings.
- HL7v2IO.Read.FetchHL7v2Message - Class in org.apache.beam.sdk.io.gcp.healthcare
-
PTransform to fetch a message from a Google Cloud Healthcare HL7v2 store based on msgID.
- HL7v2IO.Read.FetchHL7v2Message.HL7v2MessageGetFn - Class in org.apache.beam.sdk.io.gcp.healthcare
-
DoFn for fetching messages from the HL7v2 store with error handling.
- HL7v2IO.Read.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Result includes a PCollection of HL7v2Message objects for successfully read results and a PCollection of HealthcareIOError objects for failed reads.
- HL7v2IO.Write - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Write that writes the given PCollection of HL7v2 messages.
- HL7v2IO.Write.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
- HL7v2IO.Write.WriteMethod - Enum Class in org.apache.beam.sdk.io.gcp.healthcare
-
The enum of write methods.
- HL7v2Message - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type HL7v2 message to wrap the Message model.
- HL7v2Message(String, String, String, String, String, String, String, Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
- HL7v2MessageCoder - Class in org.apache.beam.sdk.io.gcp.healthcare
- HL7v2MessageGetFn() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message.HL7v2MessageGetFn
- HL7v2Read() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read
- HL7v2ReadParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
-
HL7v2ReadParameter represents the read parameters for an HL7v2 read request, used as the input type for HL7v2IO.HL7v2Read.
- HL7v2ReadParameter() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
- HL7v2ReadResponse - Class in org.apache.beam.sdk.io.gcp.healthcare
-
HL7v2ReadResponse represents the response format for an HL7v2 read request, used as the output type of HL7v2IO.HL7v2Read.
- HL7v2ReadResponse(String, HL7v2Message) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
- HL7v2ReadResponseCoder - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Coder for HL7v2ReadResponse.
- HllCount - Class in org.apache.beam.sdk.extensions.zetasketch
-
PTransforms to compute HyperLogLogPlusPlus (HLL++) sketches on data streams based on the ZetaSketch implementation.
- HllCount.Extract - Class in org.apache.beam.sdk.extensions.zetasketch
-
Provides PTransforms to extract the estimated count of distinct elements (as Longs) from each HLL++ sketch.
- HllCount.Init - Class in org.apache.beam.sdk.extensions.zetasketch
-
Provides PTransforms to aggregate inputs into HLL++ sketches.
- HllCount.Init.Builder<InputT> - Class in org.apache.beam.sdk.extensions.zetasketch
-
Builder for the HllCount.Init combining PTransform.
- HllCount.MergePartial - Class in org.apache.beam.sdk.extensions.zetasketch
-
Provides PTransforms to merge HLL++ sketches into a new sketch.
- host() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
-
The host name or IP address of the Solace broker.
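A minimal sketch of the HllCount transforms listed above (the sample input is an assumption):

    // Aggregate values into an HLL++ sketch, then extract the distinct-count estimate.
    PCollection<Long> ids = pipeline.apply(Create.of(1L, 2L, 2L, 3L));
    PCollection<byte[]> sketch = ids.apply(HllCount.Init.forLongs().globally());
    PCollection<Long> approxDistinct = sketch.apply(HllCount.Extract.globally());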
- host() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
- host() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
- host(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory.Builder
-
Set Solace host, format: Host[:Port] e.g.
- host(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory.Builder
-
Set Solace SEMP host, format: [Protocol://]Host[:Port].
- host(String) - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
-
The location of the broker, including port details if it is not listening on the default port.
- How it works - Search tag in class org.apache.beam.io.debezium.SourceRecordJson
- Section
- HttpClientConfiguration - Class in org.apache.beam.sdk.io.aws2.common
-
HTTP client configuration for both sync and async AWS clients.
- HttpClientConfiguration() - Constructor for class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
- HttpClientConfiguration.Builder - Class in org.apache.beam.sdk.io.aws2.common
- HttpHealthcareApiClient - Class in org.apache.beam.sdk.io.gcp.healthcare
-
A client that talks to the Cloud Healthcare API through HTTP requests.
- HttpHealthcareApiClient() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
Instantiates a new HTTP Healthcare API client.
- HttpHealthcareApiClient(CloudHealthcare) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
Instantiates a new HTTP Healthcare API client.
- HttpHealthcareApiClient.AuthenticatedRetryInitializer - Class in org.apache.beam.sdk.io.gcp.healthcare
- HttpHealthcareApiClient.FhirResourcePagesIterator - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type FhirResourcePagesIterator for methods which return paged output.
- HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod - Enum Class in org.apache.beam.sdk.io.gcp.healthcare
- HttpHealthcareApiClient.HealthcareHttpException - Exception Class in org.apache.beam.sdk.io.gcp.healthcare
-
Wraps HttpResponse in an exception with a statusCode field for use with HealthcareIOError.
- HttpHealthcareApiClient.HL7v2MessagePages - Class in org.apache.beam.sdk.io.gcp.healthcare
- HttpHealthcareApiClient.HL7v2MessagePages.HL7v2MessagePagesIterator - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Iterator over pages of HL7v2 message IDs.
- HyperLogLogPlusCoder() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
I
- ICEBERG - Static variable in class org.apache.beam.sdk.managed.Managed
- ICEBERG_CDC - Static variable in class org.apache.beam.sdk.managed.Managed
- IcebergCatalog - Class in org.apache.beam.sdk.extensions.sql.meta.provider.iceberg
- IcebergCatalog(String, Map<String, String>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergCatalog
- IcebergCatalogConfig - Class in org.apache.beam.sdk.io.iceberg
- IcebergCatalogConfig() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- IcebergCatalogConfig.Builder - Class in org.apache.beam.sdk.io.iceberg
- IcebergCdcReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.iceberg
-
SchemaTransform implementation for IcebergIO.readRows(org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig).
- IcebergCdcReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergCdcReadSchemaTransformProvider
- IcebergCdcReadSchemaTransformProvider.Configuration - Class in org.apache.beam.sdk.io.iceberg
- IcebergDestination - Class in org.apache.beam.sdk.io.iceberg
- IcebergDestination() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergDestination
- IcebergDestination.Builder - Class in org.apache.beam.sdk.io.iceberg
- IcebergFilter - Class in org.apache.beam.sdk.extensions.sql.meta.provider.iceberg
- IcebergFilter(List<RexNode>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergFilter
- IcebergIO - Class in org.apache.beam.sdk.io.iceberg
-
A connector that reads and writes to Apache Iceberg tables.
- IcebergIO() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergIO
- IcebergIO.ReadRows - Class in org.apache.beam.sdk.io.iceberg
- IcebergIO.ReadRows.StartingStrategy - Enum Class in org.apache.beam.sdk.io.iceberg
- IcebergIO.WriteRows - Class in org.apache.beam.sdk.io.iceberg
- IcebergReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.iceberg
-
SchemaTransform implementation for IcebergIO.readRows(org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig).
- IcebergReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergReadSchemaTransformProvider
- IcebergReadSchemaTransformProvider.Configuration - Class in org.apache.beam.sdk.io.iceberg
- icebergRecordToBeamRow(Schema, Record) - Static method in class org.apache.beam.sdk.io.iceberg.IcebergUtils
-
Converts an Iceberg Record to a Beam Row.
- IcebergScanConfig - Class in org.apache.beam.sdk.io.iceberg
- IcebergScanConfig() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
- IcebergScanConfig.Builder - Class in org.apache.beam.sdk.io.iceberg
- IcebergScanConfig.ScanType - Enum Class in org.apache.beam.sdk.io.iceberg
- icebergSchemaToBeamSchema(Schema) - Static method in class org.apache.beam.sdk.io.iceberg.IcebergUtils
-
Converts an Iceberg Schema to a Beam Schema.
- IcebergSchemaTransformTranslation - Class in org.apache.beam.sdk.io.iceberg
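A small sketch of the IcebergUtils conversion above (the field layout is an assumption):

    org.apache.iceberg.Schema icebergSchema = new org.apache.iceberg.Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.required(2, "name", Types.StringType.get()));
    org.apache.beam.sdk.schemas.Schema beamSchema =
        IcebergUtils.icebergSchemaToBeamSchema(icebergSchema);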
- IcebergSchemaTransformTranslation() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergSchemaTransformTranslation
- IcebergSchemaTransformTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.iceberg
- IcebergSchemaTransformTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.iceberg
- IcebergTableCreateConfig - Class in org.apache.beam.sdk.io.iceberg
- IcebergTableCreateConfig() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
- IcebergTableCreateConfig.Builder - Class in org.apache.beam.sdk.io.iceberg
- IcebergTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.iceberg
-
A table provider for Iceberg tables.
- IcebergTableProvider(IcebergCatalogConfig) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergTableProvider
- IcebergUtils - Class in org.apache.beam.sdk.io.iceberg
-
Utilities for converting between Beam and Iceberg types, made public for users' convenience.
- IcebergWriteResult - Class in org.apache.beam.sdk.io.iceberg
- IcebergWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.iceberg
-
SchemaTransform implementation for
IcebergIO.writeRows(org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig)
. - IcebergWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider
- IcebergWriteSchemaTransformProvider.Configuration - Class in org.apache.beam.sdk.io.iceberg
- IcebergWriteSchemaTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.io.iceberg
- id - Variable in enum class org.apache.beam.sdk.io.kafka.KafkaTimestampType
- id() - Method in class org.apache.beam.runners.jobsubmission.JobPreparation
- identifier() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
- identifier() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
-
Returns an id that uniquely represents this IO.
- identifier() - Method in class org.apache.beam.sdk.extensions.avro.schemas.io.payloads.AvroPayloadSerializerProvider
- identifier() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoPayloadSerializerProvider
- identifier() - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.AvroReadSchemaTransformFormatProvider
- identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.AvroWriteSchemaTransformFormatProvider
- identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.CsvWriteSchemaTransformFormatProvider
- identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
-
Returns the SchemaTransformProvider.identifier() required for registration.
- identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.JsonReadSchemaTransformFormatProvider
- identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.JsonWriteSchemaTransformFormatProvider
- identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.LineReadSchemaTransformFormatProvider
- identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.ParquetReadSchemaTransformFormatProvider
- identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.ParquetWriteSchemaTransformFormatProvider
- identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.XmlWriteSchemaTransformFormatProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryDlqProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
-
Implementation of the TypedSchemaTransformProvider identifier method.
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Returns an id that uniquely represents this IO.
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
-
Returns an id that uniquely represents this IO.
- identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubDlqProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
Returns an id that uniquely represents this IO.
- identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCdcReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.iceberg.IcebergReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
-
Returns an id that uniquely represents this IO.
- identifier() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromMySqlSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromOracleSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromPostgresSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromSqlServerSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToMySqlSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToOracleSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToPostgresSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToSqlServerSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadProvider
-
Implementation of the TypedSchemaTransformProvider identifier method.
- identifier() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
-
Implementation of the TypedSchemaTransformProvider identifier method.
- identifier() - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider
-
Implementation of the TypedSchemaTransformProvider identifier method.
- identifier() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider
-
Implementation of the TypedSchemaTransformProvider identifier method.
- identifier() - Method in class org.apache.beam.sdk.io.thrift.ThriftPayloadSerializerProvider
- identifier() - Method in class org.apache.beam.sdk.managed.ManagedSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.schemas.io.payloads.JsonPayloadSerializerProvider
- identifier() - Method in interface org.apache.beam.sdk.schemas.io.Providers.Identifyable
-
Returns an id that uniquely represents this among others implementing its derived interface.
- identifier() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
-
Returns an id that uniquely represents this IO.
- identifier() - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
- identifier() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
- identifier() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
- identifier() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
- identifier() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
- identifier() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
Returns an id that uniquely represents this transform.
- Identifier() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.Fixed32
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.Fixed64
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.SFixed32
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.SFixed64
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.SInt32
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.SInt64
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.UInt32
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.UInt64
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.LocalMktDate
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint16
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint32
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint64
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint8
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCDateOnly
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils.TimeWithLocalTzType
- IDENTIFIER - Static variable in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.Date
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.FixedPrecisionNumeric
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.Time
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- IDENTIFIER() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
- IDENTIFIER() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
- identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
Returns the identity element of this operation, i.e. an element e such that apply(e, x) == apply(x, e) == x for all values of x.
- identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Returns the value that should be used for the combine of the empty set.
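A hedged sketch of supplying such a value (MaxIntFn and the input PCollection are hypothetical):

    // A max combiner; identity() supplies the value used when combining an empty set.
    static class MaxIntFn extends Combine.BinaryCombineFn<Integer> {
      @Override
      public Integer apply(Integer left, Integer right) {
        return Math.max(left, right);
      }
      @Override
      public Integer identity() {
        return Integer.MIN_VALUE;  // neutral element for max
      }
    }
    PCollection<Integer> max = input.apply(Combine.globally(new MaxIntFn()));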
- identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
Returns the identity element of this operation, i.e. an element e such that apply(e, x) == apply(x, e) == x for all values of x.
- identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
Returns the identity element of this operation, i.e. an element e such that apply(e, x) == apply(x, e) == x for all values of x.
- identity() - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
- IDENTITY_ELEMENT - Static variable in class org.apache.beam.sdk.metrics.DistributionResult
-
The IDENTITY_ELEMENT is used to start accumulating distributions.
- IdGenerator - Interface in org.apache.beam.sdk.fn
-
A generator of unique IDs.
- IdGenerators - Class in org.apache.beam.sdk.fn
-
Common IdGenerator implementations.
- IdGenerators() - Constructor for class org.apache.beam.sdk.fn.IdGenerators
- idleTimeoutMs - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- IGNORE - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.EquivalenceNullablePolicy
- IGNORE_MISSING_FILES - Enum constant in enum class org.apache.beam.sdk.io.fs.MoveOptions.StandardMoveOptions
- ignored() - Static method in interface org.apache.beam.runners.fnexecution.control.BundleProgressHandler
-
Returns a handler that ignores metrics.
- Ignored CSVFormat parameters - Search tag in class org.apache.beam.sdk.io.csv.CsvIO
- Section
- ignoreInput(Watch.Growth.TerminationCondition<?, StateT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Wraps a given input-independent Watch.Growth.TerminationCondition as an equivalent condition with a given input type, passing null to the original condition as input.
- ignoreInsertIds() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Setting this option to true disables insertId-based data deduplication offered by BigQuery.
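A hedged write sketch (the table spec and input PCollection are assumptions):

    rows.apply(BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.events")
        .ignoreInsertIds()  // trades insertId-based deduplication for higher throughput
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));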
- ignoreReturnValue(Object) - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- ignoreUnknownValues() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Accept rows that contain values that do not match the schema.
- immediate(T) - Static method in class org.apache.beam.sdk.state.ReadableStates
-
A ReadableState constructed from a constant value, hence immediately available.
- immutableNames() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
- immutableNamesBuilder() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
- immutableSteps() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
- immutableStepsBuilder() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
- implement(EnumerableRelImplementor, EnumerableRel.Prefer) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
- Implementation - Search tag in org.apache.beam.sdk.io.BoundedSource.BoundedReader.getCurrentSource()
- Section
- Implementing #splitAtFraction - Search tag in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
- Section
- implementor() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamSqlUnparseContext
- IMPORT - Enum constant in enum class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.WriteMethod
-
The Import method bulk-imports resources from GCS.
- importCatalogItems() - Static method in class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
- importFhirResource(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Imports a FHIR resource from GCS.
- importFhirResource(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- importResources(String, String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Import resources.
- importResources(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Import resources.
- importUserEvents() - Static method in class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
- Impulse - Class in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- IMPULSE_ELEMENT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- ImpulseInputFormat - Class in org.apache.beam.runners.flink.translation.wrappers
-
Flink input format that implements impulses.
- ImpulseInputFormat() - Constructor for class org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat
- ImpulseP - Class in org.apache.beam.runners.jet.processors
-
Jet Processor implementation for Beam's Impulse primitive.
- ImpulseSource - Class in org.apache.beam.runners.twister2.translators.functions
-
A SourceFunc which executes the impulse transform contract.
- ImpulseSource() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ImpulseSource
- ImpulseSourceFunction - Class in org.apache.beam.runners.flink.translation.functions
-
Source function which sends a single global impulse to a downstream operator.
- ImpulseSourceFunction(long) - Constructor for class org.apache.beam.runners.flink.translation.functions.ImpulseSourceFunction
- ImpulseTranslatorBatch - Class in org.apache.beam.runners.twister2.translators.batch
-
Impulse translator.
- ImpulseTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.ImpulseTranslatorBatch
- in(Pipeline) - Static method in class org.apache.beam.sdk.values.PBegin
- in(Pipeline) - Static method in class org.apache.beam.sdk.values.PDone
- in(Pipeline, PCollection<Solace.PublishResult>) - Static method in class org.apache.beam.sdk.io.solace.write.SolaceOutput
- in(Pipeline, PCollection<FhirBundleResponse>, PCollection<HealthcareIOError<FhirBundleParameter>>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
Entry point for the ExecuteBundlesResult, storing the successful and failed bundles and their metadata.
- IN_ARRAY_OPERATOR - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- inc() - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
- inc() - Method in interface org.apache.beam.sdk.metrics.Counter
-
Increment the counter.
- inc() - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
-
Increment the counter.
- inc() - Method in class org.apache.beam.sdk.metrics.NoOpCounter
- inc(long) - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
- inc(long) - Method in interface org.apache.beam.sdk.metrics.Counter
-
Increment the counter by the given amount.
- inc(long) - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
-
Increment the counter by the given amount.
- inc(long) - Method in class org.apache.beam.sdk.metrics.NoOpCounter
- incActivePartitionReadCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.ACTIVE_PARTITION_READ_COUNT
by 1 if the metric is enabled. - incChangeStreamMutationGcCounter() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.CHANGE_STREAM_MUTATION_GC_COUNT
by 1 if the metric is enabled. - incChangeStreamMutationUserCounter() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.CHANGE_STREAM_MUTATION_USER_COUNT
by 1 if the metric is enabled. - incClosestreamCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.CLOSESTREAM_COUNT
by 1 if the metric is enabled. - incDataRecordCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.DATA_RECORD_COUNT
by 1 if the metric is enabled. - incHeartbeatCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.HEARTBEAT_COUNT
by 1 if the metric is enabled. - incHeartbeatRecordCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.HEARTBEAT_RECORD_COUNT
by 1 if the metric is enabled. - incListPartitionsCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.LIST_PARTITIONS_COUNT
by 1 if the metric is enabled. - include(String, HasDisplayData) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register display data from the specified subcomponent at the given path.
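A hedged sketch of DisplayData.Builder.include inside a transform's populateDisplayData override; the field mySource is hypothetical and is assumed to implement HasDisplayData:

    import org.apache.beam.sdk.transforms.display.DisplayData;

    // Nest the subcomponent's display data under the path "source".
    @Override
    public void populateDisplayData(DisplayData.Builder builder) {
      super.populateDisplayData(builder);
      builder.include("source", mySource);  // mySource: HasDisplayData
    }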
- inCombinedNonLatePanes(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
like this one, but with the assertion restricted to only run on the provided window across all panes that were not produced by the arrival of late data. - inCombinedNonLatePanes(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
- IncomingMessage() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
- IncompatibleWindowException - Exception Class in org.apache.beam.sdk.transforms.windowing
-
Exception thrown by
WindowFn.verifyCompatibility(WindowFn)
if two compared WindowFns are not compatible, including the explanation of incompatibility. - IncompatibleWindowException(WindowFn<?, ?>, String) - Constructor for exception class org.apache.beam.sdk.transforms.windowing.IncompatibleWindowException
- incomplete(List<TimestampedValue<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
-
Constructs a
Watch.Growth.PollResult
with the given outputs and declares that new outputs might appear for the current input. - incomplete(Instant, List<OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
-
Like
Watch.Growth.PollResult.incomplete(List)
, but assigns the same timestamp to all new outputs. - incOrphanedNewPartitionCleanedCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.ORPHANED_NEW_PARTITION_CLEANED_COUNT
by 1. - incPartitionEndRecordCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.PARTITION_END_RECORD_COUNT
by 1 if the metric is enabled. - incPartitionEventRecordCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.PARTITION_EVENT_RECORD_COUNT
by 1 if the metric is enabled. - incPartitionMergeCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.PARTITION_MERGE_COUNT
by 1 if the metric is enabled. - incPartitionReconciledWithoutTokenCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.PARTITION_RECONCILED_WITHOUT_TOKEN_COUNT
by 1. - incPartitionReconciledWithTokenCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.PARTITION_RECONCILED_WITH_TOKEN_COUNT
by 1. - incPartitionRecordCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.PARTITION_RECORD_COUNT
by 1 if the metric is enabled. - incPartitionRecordMergeCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.PARTITION_RECORD_MERGE_COUNT
by 1 if the metric is enabled. - incPartitionRecordSplitCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.PARTITION_RECORD_SPLIT_COUNT
by 1 if the metric is enabled. - incPartitionSplitCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.PARTITION_SPLIT_COUNT
by 1 if the metric is enabled. - incPartitionStartRecordCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.PARTITION_START_RECORD_COUNT
by 1 if the metric is enabled. - incPartitionStreamCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.PARTITION_STREAM_COUNT
by 1. - incQueryCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.QUERY_COUNT
by 1 if the metric is enabled. - INCRBY - Enum constant in enum class org.apache.beam.sdk.io.redis.RedisIO.Write.Method
-
Use the INCRBY command.
- increment() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns a RandomAccessData that is the smallest value of the same length which is strictly greater than this one.
- increment(Long) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.IncrementFn
- incrementAll(Date) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.DateIncrementAllFn
- IncrementFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.IncrementFn
- incrementingLongs() - Static method in class org.apache.beam.sdk.fn.IdGenerators
-
Returns an
IdGenerator
which provides successive incrementing longs. - index() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
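A brief sketch of the IdGenerator returned above; the exact starting value is assumed here:

    import org.apache.beam.sdk.fn.IdGenerator;
    import org.apache.beam.sdk.fn.IdGenerators;

    IdGenerator ids = IdGenerators.incrementingLongs();
    String first = ids.getId();   // successive longs rendered as strings, e.g. "1"
    String second = ids.getId();  // then "2"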
- INDEX_OF_MAX - Static variable in class org.apache.beam.sdk.io.ShardNameTemplate
-
Shard name containing the index and max.
- indexOf(String) - Method in class org.apache.beam.sdk.schemas.Schema
-
Find the index of a given field.
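A minimal sketch of Schema.indexOf; the field names are illustrative:

    import org.apache.beam.sdk.schemas.Schema;

    Schema schema =
        Schema.builder().addStringField("name").addInt32Field("age").build();
    int ageIndex = schema.indexOf("age");  // 1; throws if no such field exists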
- indexOfProjectionColumnRef(long, List<ResolvedColumn>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
-
Returns the index of the projection column reference.
- inEarlyGlobalWindowPanes() - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
like this one, but with the assertion restricted to only run on panes in theGlobalWindow
that were emitted before theGlobalWindow
closed. - inEarlyGlobalWindowPanes() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
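A hedged sketch of these pane-scoped PAssert methods; the collection sums, the window bounds, and the expected value are assumed for the example:

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
    import org.joda.time.Instant;

    // Assuming `sums` is a windowed PCollection<Integer> built on a TestPipeline.
    IntervalWindow window = new IntervalWindow(new Instant(0), new Instant(10_000));
    PAssert.that(sums).inOnTimePane(window).containsInAnyOrder(42);
    PAssert.that(sums).inFinalPane(window).containsInAnyOrder(42);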
- inEarlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
like this one, but with the assertion restricted to only run on the provided window across all panes that were produced by the arrival of early data. - inEarlyPane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
- inEarlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Creates a new
PAssert.SingletonAssert
like this one, but with the assertion restricted to only run on the provided window, running the checker only on early panes for each key. - InferableFunction<InputT,
OutputT> - Class in org.apache.beam.sdk.transforms -
A
ProcessFunction
which is not a functional interface. - InferableFunction() - Constructor for class org.apache.beam.sdk.transforms.InferableFunction
- InferableFunction(ProcessFunction<InputT, OutputT>) - Constructor for class org.apache.beam.sdk.transforms.InferableFunction
- inferBeamSchema(DataSource, String) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
- Inferring Beam schemas from Avro files - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroIO
- Section
- Inferring Beam schemas from Avro PCollections - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroIO
- Section
- Inferring Beam schemas from Parquet files - Search tag in class org.apache.beam.sdk.io.parquet.ParquetIO
- Section
- inferType(Object) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Infer the
DisplayData.Type
for the given object. - inFinalPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
like this one, but with the assertion restricted to only run on the provided window, running the checker only on the final pane for each key. - inFinalPane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
- inFinalPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Creates a new
PAssert.SingletonAssert
like this one, but with the assertion restricted to only run on the provided window, running the checker only on the final pane for each key. - InfluxDbIO - Class in org.apache.beam.sdk.io.influxdb
-
IO to read from and write to InfluxDB.
- InfluxDbIO.DataSourceConfiguration - Class in org.apache.beam.sdk.io.influxdb
-
A POJO describing a data source configuration, such as the URL, username, and password.
- InfluxDbIO.Read - Class in org.apache.beam.sdk.io.influxdb
-
A
PTransform
to read metrics or query-scoped data from InfluxDB. - InfluxDbIO.Write - Class in org.apache.beam.sdk.io.influxdb
-
A
PTransform
to write to an InfluxDB data source. - INFO - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
-
Deprecated.Level for logging informational messages.
- INFO - Enum constant in enum class org.apache.beam.sdk.io.snowflake.enums.StreamingLogLevel
- INFO - Enum constant in enum class org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
-
LogLevel for logging informational messages.
- INGEST - Enum constant in enum class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.WriteMethod
-
Ingest write method.
- ingestHL7v2Message(String, Message) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Ingest an HL7v2 message.
- ingestHL7v2Message(String, Message) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- ingestMessages(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Write using the Messages.Ingest method.
- INHERIT_IO_FILE - Static variable in class org.apache.beam.runners.fnexecution.environment.ProcessManager
-
A symbolic file indicating that the I/O of the parent process should be inherited.
- inheritedDescription(String, String, String, int) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
- inheritedDescription(String, String, String, int) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
- init(Outbox, Processor.Context) - Method in class org.apache.beam.runners.jet.processors.ParDoP
- init(Outbox, Processor.Context) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- init(Processor.Context) - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
- init(Processor.Context) - Method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
- init(ResultSetMetaData) - Method in interface org.apache.beam.sdk.io.singlestore.SingleStoreIO.RowMapperWithInit
- init(SparkPipelineOptions, JavaSparkContext) - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
-
Initializes the metrics accumulator if it has not already been initialized.
- initAccumulators(SparkPipelineOptions, JavaSparkContext) - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Initializes the Metrics/Aggregators accumulators.
- initClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
- initClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
- initContext(Class<?>) - Static method in class org.apache.beam.sdk.io.cdap.Plugin
-
Initializes
BatchContextImpl
for a CDAP plugin. - initialBackoff() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
- initialize(AbstractGoogleClientRequest<?>) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
- initialize(HttpRequest) - Method in class org.apache.beam.sdk.extensions.gcp.auth.NullCredentialInitializer
- initialize(HttpRequest) - Method in class org.apache.beam.sdk.extensions.gcp.util.LatencyRecordingHttpRequestInitializer
- initialize(HttpRequest) - Method in class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
- initialize(HttpRequest) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.AuthenticatedRetryInitializer
- initializeBroadcastVariable(Iterable<WindowedValue<?>>) - Method in class org.apache.beam.runners.flink.translation.functions.SideInputInitializer
- InitializeDoFn - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
-
A DoFn responsible for initializing the metadata table and preparing it to manage the state of the pipeline.
- InitializeDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
A DoFn responsible for initializing the change stream Connector.
- InitializeDoFn(DaoFactory, Instant, BigtableIO.ExistingPipelineOptions) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.InitializeDoFn
- InitializeDoFn(DaoFactory, MapperFactory, Timestamp, Timestamp) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.InitializeDoFn
- initializeSplits() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.LazyFlinkSourceSplitEnumerator
- initializeState(FunctionInitializationContext) - Method in class org.apache.beam.runners.flink.translation.functions.ImpulseSourceFunction
- initializeState(FunctionInitializationContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
- initializeState(StateInitializationContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- initializeState(StateInitializationContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.DedupingOperator
- initializeState(StateInitializationContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.PartialReduceBundleOperator
- initializeState(StateInitializationContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SplittableDoFnOperator
- initializeWriteSessionProperties(SolaceIO.SubmissionMode) - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
-
This method will be called by the write connector when a new session is started.
- InitialPartition - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Utility class defining the initial partition constants and related helper methods.
- InitialPartition() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
- InitialPipelineState - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
-
State used to initialize a pipeline, as output by
InitializeDoFn
. - InitialPipelineState(Instant, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
- initialRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
- initialRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
- initialRestriction(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
-
Uses a
TimestampRange
with a max range. - initialRestriction(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
-
The restriction for a partition will be defined from the start and end timestamp to query the partition for.
- initialSystemTimeAt(Instant) - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Set the initial synchronized processing time.
- initPluginType(Class<?>) - Static method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets the value of a plugin type.
- initPulsarClients() - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
- InjectPackageStrategy(Class<?>) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.InjectPackageStrategy
- inLatePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
with the assertion restricted to only run on the provided window across all panes that were produced by the arrival of late data. - inLatePane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
- inLatePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Creates a new
PAssert.SingletonAssert
with the assertion restricted to only run on the provided window, running the checker only on late panes for each key. - inMemory() - Method in class org.apache.beam.sdk.transforms.View.AsList
-
Returns a PCollection view like this one, but whose resulting list will be entirely cached in memory.
- inMemory() - Method in class org.apache.beam.sdk.transforms.View.AsMap
-
Returns a PCollection view like this one, but whose resulting map will be entirely cached in memory.
- inMemory() - Method in class org.apache.beam.sdk.transforms.View.AsMultimap
-
Returns a PCollection view like this one, but whose resulting map will be entirely cached in memory.
- inMemory(boolean) - Method in class org.apache.beam.sdk.transforms.View.AsList
-
Returns a PCollection view like this one, but whose resulting list will be entirely cached in memory according to the input parameter.
- inMemory(boolean) - Method in class org.apache.beam.sdk.transforms.View.AsMap
-
Returns a PCollection view like this one, but whose resulting map will be entirely cached in memory according to the input parameter.
- inMemory(boolean) - Method in class org.apache.beam.sdk.transforms.View.AsMultimap
-
Returns a PCollection view like this one, but whose resulting map will be entirely cached in memory according to the input parameter.
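A hedged sketch of the in-memory view variants above; the input collection kvs is assumed:

    import java.util.Map;
    import org.apache.beam.sdk.transforms.View;
    import org.apache.beam.sdk.values.PCollectionView;

    // Assuming `kvs` is a PCollection<KV<String, Integer>>; with the flag
    // set to true, the resulting map view is fully cached in memory.
    PCollectionView<Map<String, Integer>> mapView =
        kvs.apply(View.<String, Integer>asMap().inMemory(true));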
- inMemory(TableProvider...) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
-
This method creates
BeamSqlEnv
using empty Pipeline Options. - InMemoryBagUserStateFactory<K,
V, - Class in org.apache.beam.runners.fnexecution.stateW> -
Holds user state in memory.
- InMemoryBagUserStateFactory() - Constructor for class org.apache.beam.runners.fnexecution.state.InMemoryBagUserStateFactory
- InMemoryCatalog - Class in org.apache.beam.sdk.extensions.sql.meta.catalog
- InMemoryCatalog(String, Map<String, String>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- InMemoryCatalogManager - Class in org.apache.beam.sdk.extensions.sql.meta.catalog
- InMemoryCatalogManager() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogManager
- InMemoryCatalogRegistrar - Class in org.apache.beam.sdk.extensions.sql.meta.catalog
- InMemoryCatalogRegistrar() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogRegistrar
- inMemoryFinalizer(InstructionRequestHandler) - Static method in class org.apache.beam.runners.fnexecution.control.BundleFinalizationHandlers
-
A bundle finalizer that stores all bundle finalization requests in memory.
- InMemoryJobService - Class in org.apache.beam.runners.jobsubmission
-
An InMemoryJobService that prepares and runs jobs on behalf of a client using a
JobInvoker
. - InMemoryListFromMultimapViewFn(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
- inMemoryListView(PCollection<T>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a
PCollectionView<List<T>>
capable of processing elements windowed using the providedWindowingStrategy
. - InMemoryListViewFn(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
- inMemoryListViewUsingVoidKey(PCollection<KV<Void, T>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a
PCollectionView<List<T>>
capable of processing elements windowed using the providedWindowingStrategy
. - InMemoryMapFromVoidKeyViewFn(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapFromVoidKeyViewFn
- inMemoryMapView(PCollection<KV<K, V>>, Coder<K>, Coder<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a
PCollectionView<Map<K, V>>
capable of processing elements windowed using the providedWindowingStrategy
. - InMemoryMapViewFn(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapViewFn
- inMemoryMapViewUsingVoidKey(PCollection<KV<Void, KV<K, V>>>, Coder<K>, Coder<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a
PCollectionView<Map<K, V>>
capable of processing elements windowed using the providedWindowingStrategy
. - InMemoryMetaStore - Class in org.apache.beam.sdk.extensions.sql.meta.store
-
A
MetaStore
which stores the meta info in memory. - InMemoryMetaStore() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
- InMemoryMetaTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider
-
An
InMemoryMetaTableProvider
is an abstractTableProvider
for in-memory types. - InMemoryMetaTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
- InMemoryMultimapFromVoidKeyViewFn(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapFromVoidKeyViewFn
- inMemoryMultimapView(PCollection<KV<K, V>>, Coder<K>, Coder<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a
PCollectionView<Map<K, Iterable<V>>>
capable of processing elements windowed using the providedWindowingStrategy
. - InMemoryMultimapViewFn(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapViewFn
- inMemoryMultimapViewUsingVoidKey(PCollection<KV<Void, KV<K, V>>>, Coder<K>, Coder<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a
PCollectionView<Map<K, Iterable<V>>>
capable of processing elements windowed using the providedWindowingStrategy
. - inNamespace(Class<?>) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
- inNamespace(String) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
- Inner() - Constructor for class org.apache.beam.sdk.schemas.transforms.Filter.Inner
- innerBroadcastJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
-
Perform an inner join, broadcasting the right side.
- innerJoin(String, PCollection<KV<K, V1>>, PCollection<KV<K, V2>>) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
-
Inner join of two collections of KV elements.
- innerJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
-
Inner join of two collections of KV elements.
- innerJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
-
Perform an inner join.
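A minimal sketch of the join-library innerJoin above; the two keyed inputs are assumed:

    import org.apache.beam.sdk.extensions.joinlibrary.Join;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Assuming `left` is a PCollection<KV<String, Integer>> and `right` is a
    // PCollection<KV<String, String>> in the same pipeline.
    PCollection<KV<String, KV<Integer, String>>> joined = Join.innerJoin(left, right);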
- inOnlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
like this one, but with the assertion restricted to only run on the provided window. - inOnlyPane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
- inOnlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Creates a new
PAssert.SingletonAssert
like this one, but with the assertion restricted to only run on the provided window. - inOnTimePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
like this one, but with the assertion restricted to only run on the provided window. - inOnTimePane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
- inOnTimePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Creates a new
PAssert.SingletonAssert
like this one, but with the assertion restricted to only run on the provided window, running the checker only on the on-time pane for each key. - inOrder(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterEach
-
Returns an
AfterEach
Trigger
with the given subtriggers. - inOrder(Trigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterEach
-
Returns an
AfterEach
Trigger
with the given subtriggers. - InProcessServerFactory - Class in org.apache.beam.sdk.fn.server
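A brief sketch of AfterEach.inOrder with two common subtriggers; the choice of subtriggers is illustrative:

    import org.apache.beam.sdk.transforms.windowing.AfterEach;
    import org.apache.beam.sdk.transforms.windowing.AfterPane;
    import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
    import org.apache.beam.sdk.transforms.windowing.Trigger;

    // Fire once after the first element arrives, then once when the
    // watermark passes the end of the window.
    Trigger trigger =
        AfterEach.inOrder(
            AfterPane.elementCountAtLeast(1),
            AfterWatermark.pastEndOfWindow());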
- INPUT - Enum constant in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Format
- INPUT - Enum constant in enum class org.apache.beam.sdk.io.cdap.PluginConstants.FormatProvider
- INPUT - Static variable in class org.apache.beam.sdk.managed.Managed.ManagedTransform
- INPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
- INPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
- INPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
- INPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
- INPUT_TAG - Static variable in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
- inputCollectionNames() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
-
The expected
PCollectionRowTuple
input tags. - inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
-
Implementation of the
TypedSchemaTransformProvider
inputCollectionNames method. - inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadProvider
-
Implementation of the
TypedSchemaTransformProvider
inputCollectionNames method. - inputCollectionNames() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
-
Implementation of the
TypedSchemaTransformProvider
inputCollectionNames method. - inputCollectionNames() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider
-
Implementation of the
TypedSchemaTransformProvider
inputCollectionNames method. - inputCollectionNames() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
- inputCollectionNames() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
Returns the input collection names of this transform.
- inputFormatProvider - Variable in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
-
This should be set after
SubmitterLifecycle.prepareRun(Object)
call, passing this context object as a parameter. - inputOf(Contextful.Fn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
Like
TypeDescriptors.inputOf(ProcessFunction)
but forContextful.Fn
. - inputOf(ProcessFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
Returns a type descriptor for the input of the given
ProcessFunction
, subject to Java type erasure: may contain unresolved type variables if the type was erased. - inputOf(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
Binary compatibility adapter for
TypeDescriptors.inputOf(ProcessFunction)
. - INPUTS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- insert(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Inserts the partition metadata.
- insert(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Inserts the partition metadata.
- INSERT - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
- INSERT_OR_UPDATE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- INSERT_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- insertAll(TableReference, List<TableRow>, List<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- insertAll(TableReference, List<FailsafeValueInSingleWindow<TableRow, TableRow>>, List<String>, InsertRetryPolicy, List<ValueInSingleWindow<T>>, ErrorContainer<T>, boolean, boolean, boolean, List<ValueInSingleWindow<TableRow>>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Inserts
TableRows
with the specified insertIds if not null. - insertAll(TableReference, List<FailsafeValueInSingleWindow<TableRow, TableRow>>, List<String>, InsertRetryPolicy, List<ValueInSingleWindow<T>>, ErrorContainer<T>, boolean, boolean, boolean, List<ValueInSingleWindow<TableRow>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
- insertAll(TableReference, List<FailsafeValueInSingleWindow<TableRow, TableRow>>, List<String>, InsertRetryPolicy, List<ValueInSingleWindow<T>>, ErrorContainer<T>, boolean, boolean, boolean, List<ValueInSingleWindow<TableRow>>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- InsertBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.InsertBuilder
- insertDataToTable(String, String, String, List<Map<String, Object>>) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Inserts rows into a table using a BigQuery streaming write.
- insertDeduplicate() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
- insertDistributedSync() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
- Insertion Method - Search tag in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
- Section
- InsertOrUpdateBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.InsertOrUpdateBuilder
- insertQuorum() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
- InsertRetryPolicy - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A retry policy for streaming BigQuery inserts.
- InsertRetryPolicy() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
- InsertRetryPolicy.Context - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Contains information about a failed insert.
- insertRows(Schema, Row...) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
- instance() - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelMetadataQuery
- INSTANCE - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
- INSTANCE - Static variable in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider.Factory
- INSTANCE - Static variable in class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView.Factory
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.GenericDatumFactory
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamJavaTypeFactory
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregateProjectMergeRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregationRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamBasicAggregationRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcMergeRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCoGBKJoinRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamEnumerableConverterRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIntersectRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOPushDownRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOSinkRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinAssociateRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMatchRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMinusRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputJoinRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputLookupJoinRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSortRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamTableFunctionScanRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUncollectRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnionRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnnestRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamValuesRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamWindowRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamJavaUdfCalcRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcMergeRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcSplittingRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRule
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRule
- INSTANCE - Static variable in exception class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.CloseException
- INSTANCE - Static variable in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
- INSTANCE - Static variable in class org.apache.beam.sdk.schemas.AutoValueSchema.AbstractGetterTypeSupplier
- INSTANCE - Static variable in class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
- INSTANCE - Static variable in class org.apache.beam.sdk.schemas.JavaFieldSchema.JavaFieldTypeSupplier
- INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
- INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
-
Singleton instance of
GlobalWindow
. - INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
- instanceId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- InstantCoder - Class in org.apache.beam.sdk.coders
- InstantDeserializer - Class in org.apache.beam.sdk.io.kafka.serialization
-
Kafka
Deserializer
forInstant
. - InstantDeserializer() - Constructor for class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
- instantiateCoder(String, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
-
Creates a coder for a given PCollection id from the Proto definition.
- instantiateDestination(String) - Method in interface org.apache.beam.sdk.io.iceberg.DynamicDestinations
- instantiateHealthcareClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message.HL7v2MessageGetFn
-
Instantiate healthcare client (version v1).
- instantiateHealthcareClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message.HL7v2MessageGetFn
-
Instantiate healthcare client (version v1).
- instantiateRunnerWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
-
Instantiates a runner-side wire coder for the given PCollection.
- instantiateRunnerWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components, RunnerApi.ExecutableStagePayload.WireCoderSetting) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
-
Instantiates a runner-side wire coder for the given PCollection.
- InstantSerializer - Class in org.apache.beam.sdk.io.kafka.serialization
-
Kafka
Serializer
forInstant
. - InstantSerializer() - Constructor for class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
- InstructionRequestHandler - Interface in org.apache.beam.runners.fnexecution.control
-
Interface for any function that can handle a Fn API
BeamFnApi.InstructionRequest
. - INT16 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- INT16 - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- INT16 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- INT16 - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of int16 fields.
- INT32 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- INT32 - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- INT32 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- INT32 - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of int32 fields.
- INT64 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- INT64 - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- INT64 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- INT64 - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of int64 fields.
- INT8 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- INT8 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- IntBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.IntBundle
- INTEGER - Enum constant in enum class org.apache.beam.sdk.transforms.display.DisplayData.Type
- INTEGER - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- integers() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The
TypeDescriptor
for Integer. - integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a
PTransform
that takes an inputPCollection<Integer>
and returns aPCollection<Integer>
whose contents is the maximum of the inputPCollection
's elements, orInteger.MIN_VALUE
if there are no elements. - integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a
PTransform
that takes an inputPCollection<Integer>
and returns aPCollection<Integer>
whose contents is a single value that is the minimum of the inputPCollection
's elements, orInteger.MAX_VALUE
if there are no elements. - integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
-
Returns a
PTransform
that takes an inputPCollection<Integer>
and returns aPCollection<Integer>
whose contents is the sum of the inputPCollection
's elements, or0
if there are no elements. - integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a
PTransform
that takes an inputPCollection<KV<K, Integer>>
and returns aPCollection<KV<K, Integer>>
that contains an output element mapping each distinct key in the inputPCollection
to the maximum of the values associated with that key in the inputPCollection
. - integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a
PTransform
that takes an inputPCollection<KV<K, Integer>>
and returns aPCollection<KV<K, Integer>>
that contains an output element mapping each distinct key in the inputPCollection
to the minimum of the values associated with that key in the inputPCollection
. - integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
-
Returns a
PTransform
that takes an inputPCollection<KV<K, Integer>>
and returns aPCollection<KV<K, Integer>>
that contains an output element mapping each distinct key in the inputPCollection
to the sum of the values associated with that key in the inputPCollection
. - INTERACTIVE - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
-
Specifies that a query should be run with an INTERACTIVE priority.
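A minimal sketch of the per-key combiners listed above (Max.integersPerKey and Sum.integersPerKey); the input collection scores is assumed:

    import org.apache.beam.sdk.transforms.Max;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Assuming `scores` is a PCollection<KV<String, Integer>>.
    PCollection<KV<String, Integer>> maxPerKey = scores.apply(Max.integersPerKey());
    PCollection<KV<String, Integer>> sumPerKey = scores.apply(Sum.integersPerKey());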
- interceptor() - Static method in class org.apache.beam.sdk.fn.server.GrpcContextHeaderAccessorProvider
- interceptResponse(HttpResponse) - Method in class org.apache.beam.sdk.extensions.gcp.util.UploadIdResponseInterceptor
- Intermediate Representation - Search tag in class org.apache.beam.sdk.extensions.sbe.SbeSchema
- Section
- Internal - Annotation Interface in org.apache.beam.sdk.annotations
-
Signifies that a publicly accessible API (public class, method or field) is intended for internal use only and not for public consumption.
- InterpolateData() - Constructor for class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
- interpolateKey(double) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns a
ByteKey
key
such that[startKey, key)
represents approximately the specified fraction of the range[startKey, endKey)
. - Interpreting ByteKey in a ByteKeyRange - Search tag in class org.apache.beam.sdk.io.range.ByteKeyRange
- Section
- INTERSECT - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.OpType
- intersectAll() - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new
PTransform
that follows SET ALL semantics: it takes a PCollectionList<PCollection<T>> and returns a PCollection<T> containing the intersection, keeping duplicates, of all collections in the PCollectionList<T>, computed in order
. - intersectAll(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new
PTransform
that follows SET ALL semantics to compute the intersection with the provided PCollection<T>
. - intersectDistinct() - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a
PTransform
that takes a PCollectionList<PCollection<T>> and returns a PCollection<T> containing the distinct intersection of all collections in the PCollectionList<T>, computed in order
. - intersectDistinct(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new
PTransform
that follows SET DISTINCT semantics to compute the intersection with the provided PCollection<T>
. - intersects(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns whether this window intersects the given window.
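A hedged sketch of the Sets.intersectDistinct transform listed above; the two inputs are assumed:

    import org.apache.beam.sdk.transforms.Sets;
    import org.apache.beam.sdk.values.PCollection;

    // Assuming `lhs` and `rhs` are PCollection<String>s in the same pipeline.
    PCollection<String> common = lhs.apply(Sets.intersectDistinct(rhs));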
- IntervalWindow - Class in org.apache.beam.sdk.transforms.windowing
-
An implementation of
BoundedWindow
that represents an interval fromIntervalWindow.start
(inclusive) toIntervalWindow.end
(exclusive). - IntervalWindow(Instant, Instant) - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Creates a new IntervalWindow that represents the half-open time interval [start, end).
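A minimal sketch of the two IntervalWindow constructors and intersects; the bounds are illustrative:

    import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    // Two half-open windows, [0s, 10s) and [5s, 15s); they overlap.
    IntervalWindow w1 = new IntervalWindow(new Instant(0), new Instant(10_000));
    IntervalWindow w2 = new IntervalWindow(new Instant(5_000), Duration.standardSeconds(10));
    boolean overlapping = w1.intersects(w2);  // true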
- IntervalWindow(Instant, ReadableDuration) - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow
- IntervalWindow.IntervalWindowCoder - Class in org.apache.beam.sdk.transforms.windowing
-
Encodes an
IntervalWindow
as a pair of its upper bound and duration. - IntervalWindowCoder() - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
- into(WindowFn<? super T, ?>) - Static method in class org.apache.beam.sdk.transforms.windowing.Window
- into(TypeDescriptor<K2>) - Static method in class org.apache.beam.sdk.transforms.MapKeys
-
Returns a new
MapKeys
transform with the given type descriptor for the output type, but the mapping function yet to be specified usingMapKeys.via(SerializableFunction)
. - into(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.FlatMapElements
-
Returns a new
FlatMapElements
transform with the given type descriptor for the output type, but the mapping function yet to be specified usingFlatMapElements.via(ProcessFunction)
. - into(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.MapElements
-
Returns a new
MapElements
transform with the given type descriptor for the output type, but the mapping function yet to be specified usingMapElements.via(ProcessFunction)
. - into(TypeDescriptor<V2>) - Static method in class org.apache.beam.sdk.transforms.MapValues
-
Returns a new
MapValues
transform with the given type descriptor for the output type, but the mapping function yet to be specified usingMapValues.via(SerializableFunction)
. - InTransactionContext(String, TransactionContext, Dialect) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Constructs a context to execute a user-defined function transactionally.
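A minimal sketch of the MapElements.into/via pattern listed above; the input collection words is assumed:

    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Assuming `words` is a PCollection<String>.
    PCollection<Integer> lengths =
        words.apply(MapElements.into(TypeDescriptors.integers())
            .via((String word) -> word.length()));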
- InvalidConfigurationException - Exception Class in org.apache.beam.sdk.schemas.io
-
Exception thrown when the configuration for a
SchemaIO
is invalid. - InvalidConfigurationException(String) - Constructor for exception class org.apache.beam.sdk.schemas.io.InvalidConfigurationException
- InvalidConfigurationException(String, Throwable) - Constructor for exception class org.apache.beam.sdk.schemas.io.InvalidConfigurationException
- InvalidConfigurationException(Throwable) - Constructor for exception class org.apache.beam.sdk.schemas.io.InvalidConfigurationException
- InvalidLocationException - Exception Class in org.apache.beam.sdk.schemas.io
-
Exception thrown when the location for a
SchemaIO
is invalid. - InvalidLocationException(String) - Constructor for exception class org.apache.beam.sdk.schemas.io.InvalidLocationException
- InvalidLocationException(String, Throwable) - Constructor for exception class org.apache.beam.sdk.schemas.io.InvalidLocationException
- InvalidLocationException(Throwable) - Constructor for exception class org.apache.beam.sdk.schemas.io.InvalidLocationException
- InvalidSchemaException - Exception Class in org.apache.beam.sdk.schemas.io
-
Exception thrown when the schema for a
SchemaIO
is invalid. - InvalidSchemaException(String) - Constructor for exception class org.apache.beam.sdk.schemas.io.InvalidSchemaException
- InvalidSchemaException(String, Throwable) - Constructor for exception class org.apache.beam.sdk.schemas.io.InvalidSchemaException
- InvalidSchemaException(Throwable) - Constructor for exception class org.apache.beam.sdk.schemas.io.InvalidSchemaException
- InvalidTableException - Exception Class in org.apache.beam.sdk.extensions.sql.meta.provider
-
Exception thrown when the request for a table is invalid, such as invalid metadata.
- InvalidTableException(String) - Constructor for exception class org.apache.beam.sdk.extensions.sql.meta.provider.InvalidTableException
- InvalidTableException(String, Throwable) - Constructor for exception class org.apache.beam.sdk.extensions.sql.meta.provider.InvalidTableException
- InvalidTableException(Throwable) - Constructor for exception class org.apache.beam.sdk.extensions.sql.meta.provider.InvalidTableException
- invertNormalizedKey() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- invocationUtil - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- invokeAdvance(ReaderT) - Method in class org.apache.beam.runners.flink.metrics.ReaderInvocationUtil
- invokeFinishBundle() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- invokeStart(ReaderT) - Method in class org.apache.beam.runners.flink.metrics.ReaderInvocationUtil
- invokeWithExecutor(RunnerApi.Pipeline, Struct, String, ListeningExecutorService) - Method in class org.apache.beam.runners.flink.FlinkJobInvoker
- invokeWithExecutor(RunnerApi.Pipeline, Struct, String, ListeningExecutorService) - Method in class org.apache.beam.runners.spark.SparkJobInvoker
- invokeWithExecutor(RunnerApi.Pipeline, Struct, String, ListeningExecutorService) - Method in class org.apache.beam.runners.jobsubmission.JobInvoker
-
Start running a job, abstracting its state as a
JobInvocation
instance. - inWindow(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
like this one, but with the assertion restricted to only run on the provided window. - inWindow(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
- inWindow(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Creates a new
PAssert.SingletonAssert
like this one, but with the assertion restricted to only run on the provided window. - ioException() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
-
Returns the
IOException
. - ir() - Method in class org.apache.beam.sdk.extensions.sbe.SerializableIr
-
Returns the underlying
Ir
. - IrOptions() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
- IS_MERGING_WINDOW_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- IS_PAIR_LIKE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- IS_STREAM_LIKE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- IS_WRAPPER - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- isAbsolute() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- isAccessible() - Method in interface org.apache.beam.sdk.options.ValueProvider
-
Whether the contents of this
ValueProvider
is currently available viaValueProvider.get()
. - isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
- isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
- isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
- isAliveOrThrow() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessManager.RunningProcess
-
Checks if the underlying process is still running.
- isAllowDuplicates() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- isAllowedLatenessSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- isAlreadyMerged() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- isAppProfileSingleClusterAndTransactional(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
-
Verifies that the app profile uses single-cluster routing with single-row transactions allowed.
- isArray() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns true if this type is known to be an array type.
- isAtSplitPoint() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
Returns true if the reader is at a split point.
- isAtSplitPoint() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
Returns true only for the first record; compressed sources cannot be split.
- isAtSplitPoint() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
Returns whether the current record is at a split point (i.e., whether the current record would be the first record to be read by a source with a specified start offset of
OffsetBasedSource.OffsetBasedReader.getCurrentOffset()
). - isAutoBalanceWriteFilesShardingEnabled() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- isAvailable() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- isAvailableForAliveReaders() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSourceReader
- isAvailableForAliveReaders() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
-
This method needs to be overridden by subclasses to determine if data is available when there are alive readers.
- isAvailableForAliveReaders() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
-
Checks whether data is available from alive readers.
- isBasicType() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- isBasicType() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- isBlockOnRun() - Method in interface org.apache.beam.runners.direct.DirectOptions
- isBlockOnRun() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
- isBounded() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
- isBounded() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
-
Whether the collection of rows represented by this relational expression is bounded (known to be finite) or unbounded (may or may not be finite).
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
- isBounded() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Whether this table is bounded (known to be finite) or unbounded (may or may not be finite).
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorTable
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Indicates whether the PCollections produced by this transform will contain a bounded or unbounded number of elements.
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
-
This restriction tracker is for unbounded streams.
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- isBounded() - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
- isBounded() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
- isBounded() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
- isBounded() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
- isBounded() - Method in class org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker
- isBounded() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
- isBounded() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
-
Return the boundedness of the current restriction.
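For example, a tracker over a fixed offset range reports a bounded restriction (a minimal sketch):

    import org.apache.beam.sdk.io.range.OffsetRange;
    import org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker;
    import org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker;

    OffsetRangeTracker tracker = new OffsetRangeTracker(new OffsetRange(0, 100));
    // A fixed [0, 100) range is finite, so the restriction is bounded.
    assert tracker.isBounded() == RestrictionTracker.IsBounded.BOUNDED;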
- isBounded() - Method in class org.apache.beam.sdk.values.PCollection
- isBoundedCollection(Collection<PCollection<?>>) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
- isCacheDisabled() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- isCandidateForGroupByKeyAndWindow(GroupByKey<K, V>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Returns whether the given GBK transform is a candidate for group-by-key-and-window translation, which aims to reduce memory usage.
- isCleanArtifactsPerJob() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- isClosed() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- isClosed() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageProducer
-
Returns true if the message producer is closed, false otherwise.
- isClosed() - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageProducer
- isClosed() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.DefaultErrorHandler
- isClosed() - Method in interface org.apache.beam.sdk.transforms.errorhandling.ErrorHandler
- isClosed() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler
- isCollectionType() - Method in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- isCommitOffsetsInFinalizeEnabled() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
- isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
- isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
- isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
For internal use only; no backwards-compatibility guarantees.
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Deprecated. Please override verifyCompatibility to throw a useful error message; isCompatible will be removed in version 3.0.0.
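A sketch of the replacement pattern: calling verifyCompatibility directly and handling the descriptive exception (the window sizes here are illustrative):

    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.IncompatibleWindowException;
    import org.apache.beam.sdk.transforms.windowing.WindowFn;
    import org.joda.time.Duration;

    WindowFn<Object, ?> oneMinute = FixedWindows.of(Duration.standardMinutes(1));
    WindowFn<Object, ?> fiveMinutes = FixedWindows.of(Duration.standardMinutes(5));
    try {
      oneMinute.verifyCompatibility(fiveMinutes);
    } catch (IncompatibleWindowException e) {
      // Differing fixed-window sizes are incompatible; e.getMessage() explains why.
    }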
- isCompositeType() - Method in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- isCompound() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Whether it's a compound table name (with multiple path components).
- isCompressed(String) - Static method in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated. Returns whether the file's extension matches one of the known compression formats.
- isCompressed(String) - Method in enum class org.apache.beam.sdk.io.Compression
- isCompressionEnabled() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- isConsumingReceivedData() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
- isCooperative() - Method in class org.apache.beam.runners.jet.processors.ParDoP
- isCooperative() - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- isCooperative() - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
- isCooperative() - Method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
- isDateTimeType(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
-
Returns true if the type is any of the various date time types.
- isDateType() - Method in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- isDdl(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
- isDecimal(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
-
Checks if type is decimal.
- isDeleteCheckpointDir() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
- isDirectory() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
- isDirectory() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
- isDirectory() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
-
Returns
true
if this ResourceId
represents a directory, false otherwise. - isDisjoint(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns whether this window is disjoint from the given window.
- isDone() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
Returns true if the last call to
OffsetBasedSource.OffsetBasedReader.start()
or OffsetBasedSource.OffsetBasedReader.advance()
returned false. - isDone() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
- isDone() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
- isDynamicRead() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- isEmpty() - Method in class org.apache.beam.runners.flink.translation.functions.FlinkSideInputReader
- isEmpty() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.CachedSideInputReader
- isEmpty() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
- isEmpty() - Method in interface org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator
-
Check if this accumulator is empty.
- isEmpty() - Method in class org.apache.beam.runners.spark.util.CachedSideInputReader
- isEmpty() - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
- isEmpty() - Method in class org.apache.beam.runners.twister2.utils.Twister2SideInputReader
- isEmpty() - Method in class org.apache.beam.sdk.fn.data.WeightedList
- isEmpty() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- isEmpty() - Method in class org.apache.beam.sdk.io.range.ByteKey
- isEmpty() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- isEmpty() - Method in interface org.apache.beam.sdk.state.GroupingState
-
Returns a
ReadableState
whose ReadableState.read()
method will return true if this state is empty at the point when that ReadableState.read()
call returns. - isEmpty() - Method in interface org.apache.beam.sdk.state.MapState
-
Returns a
ReadableState
whose ReadableState.read()
method will return true if this state is empty at the point when that ReadableState.read()
call returns. - isEmpty() - Method in interface org.apache.beam.sdk.state.MultimapState
-
Returns a
ReadableState
whose ReadableState.read()
method will return true if this state is empty at the point when that ReadableState.read()
call returns. - isEmpty() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
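A sketch of reading this state inside a stateful DoFn, assuming a hypothetical bag state named "seen" over KV<String, Long> input:

    import org.apache.beam.sdk.state.BagState;
    import org.apache.beam.sdk.state.ReadableState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    class FirstPerKeyFn extends DoFn<KV<String, Long>, Long> {
      @StateId("seen")
      private final StateSpec<BagState<Long>> seenSpec = StateSpecs.bag();

      @ProcessElement
      public void process(ProcessContext c, @StateId("seen") BagState<Long> seen) {
        ReadableState<Boolean> empty = seen.isEmpty(); // evaluation is deferred until read()
        if (empty.read()) {
          c.output(c.element().getValue()); // first element observed for this key
        }
        seen.add(c.element().getValue());
      }
    }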
- isEmpty() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
- isEmpty() - Method in class org.apache.beam.sdk.transforms.Requirements
-
Whether this is an empty set of requirements.
- isEmpty(StateAccessor<K>) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
- isEnableStreamingEngine() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
- isEncodingPositionsOverridden() - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns whether encoding positions have been explicitly overridden.
- isEnforceEncodability() - Method in interface org.apache.beam.runners.direct.DirectOptions
- isEnforceImmutability() - Method in interface org.apache.beam.runners.direct.DirectOptions
- isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return true if
PubsubClient.pull(long, org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath, int, boolean)
will always return an empty list. - isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- isEOF() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageReceiver
-
Test clients may return true to signal that all expected messages have been pulled and the test may complete.
- isEOS() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- isEqualTo(T) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Asserts that the value in question is equal to the provided value, according to
Object.equals(java.lang.Object)
. - isEqWithEpsilon(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- isExternalizedCheckpointsEnabled() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
Enables or disables externalized checkpoints.
- isFailToLock() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- isFirst() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
Return true if this is the first pane produced for the associated window.
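A short sketch of inspecting pane timing from a DoFn's process context (both isFirst and isLast are PaneInfo accessors):

    import org.apache.beam.sdk.transforms.DoFn;

    class PaneAwareFn extends DoFn<String, String> {
      @ProcessElement
      public void process(ProcessContext c) {
        if (c.pane().isFirst()) {
          // first firing for this window (may also be the last)
        }
        if (c.pane().isLast()) {
          // no further firings will occur for this window
        }
        c.output(c.element());
      }
    }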
- IsFlinkNativeTransform() - Constructor for class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.IsFlinkNativeTransform
- IsFlinkNativeTransform() - Constructor for class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.IsFlinkNativeTransform
- isForceStreaming() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
- isForceWatermarkSync() - Method in class org.apache.beam.runners.spark.io.CreateStream
- isGetter(Method) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- isHeartbeat() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- isHotKeyLoggingEnabled() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
If enabled then the literal key will be logged to Cloud Logging if a hot key is detected.
- isImmutableType() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- isImmutableType() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- isImmutableType() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- isIn(Collection<T>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria to Matchers.isIn(Collection)
. - isIn(Coder<T>, Collection<T>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria to Matchers.isIn(Collection)
. - isIn(Coder<T>, T[]) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria to Matchers.isIn(Object[])
. - isIn(T[]) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria to Matchers.isIn(Object[])
. - isInboundEdgeOfVertex(Edge, String, String, String) - Method in interface org.apache.beam.runners.jet.DAGBuilder.WiringListener
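A minimal sketch of the isIn variants listed above; unlike plain Hamcrest matchers, the returned matchers are serializable and can be shipped to workers:

    import java.util.Arrays;
    import org.apache.beam.sdk.testing.SerializableMatcher;
    import org.apache.beam.sdk.testing.SerializableMatchers;

    SerializableMatcher<Integer> inList = SerializableMatchers.isIn(Arrays.asList(1, 2, 3));
    SerializableMatcher<Integer> inArray = SerializableMatchers.isIn(new Integer[] {1, 2, 3});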
- isInboundEdgeOfVertex(Edge, String, String, String) - Method in class org.apache.beam.runners.jet.processors.ParDoP.Supplier
- isInboundEdgeOfVertex(Edge, String, String, String) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP.Supplier
- isInboundEdgeOfVertex(Edge, String, String, String) - Method in class org.apache.beam.runners.jet.processors.FlattenP.Supplier
- isInf(Double) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.IsInf
- isInf(Float) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.IsInf
- IsInf - Class in org.apache.beam.sdk.extensions.sql.impl.udf
-
IS_INF(X)
- IsInf() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.IsInf
- isInfinite() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- isInitialEvent(long, EventT) - Method in interface org.apache.beam.sdk.extensions.ordered.EventExaminer
-
Is this event the first expected event for the given key and window when a per-key sequence is used? When a global sequence is used, this determines the first global sequence event.
- isInitialPartition(String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
-
Verifies if the given partition token is the initial partition.
- isInputSortRelAndLimitOnly() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
- isInsertDefault() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns whether or not this transformation applies a default value.
- isIntegral(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
-
Checks if type is integral.
- isJoinLegal(Join) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
-
This method checks if a join is legal and can be converted into Beam SQL.
- isKey(ImmutableBitSet) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- isKeyType() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- isKeyType() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- isLast() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
Return true if this is the last pane that will be produced in the associated window.
- isLastEvent(long, EventT) - Method in interface org.apache.beam.sdk.extensions.ordered.EventExaminer
-
Is this event the last expected event for a given key and window?
- isLastEventReceived() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- isLastRecordInTransactionInPartition() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
Indicates whether this record is the last emitted for the given transaction in the given partition.
- isLe(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- isLeaf(PCollection<?>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
- isLeaf(PCollection<?>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- isLeaf(PCollection<?>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Checks whether the given
PCollection
is a leaf. - isLimitOnly() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
- isLogicalType() - Method in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- isLogicalType(String) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- isLt(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- ISM_SHARD_INDEX_CODER - Static variable in class org.apache.beam.runners.dataflow.internal.IsmFormat
-
A
ListCoder
wrapping an IsmFormat.IsmShardCoder
used to encode the shard index. - isMapType() - Method in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- isMetadata() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
- isMetadataKey(List<?>) - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat
-
Returns true if and only if any of the passed in key components represent a metadata key.
- isMetricsSupported() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
-
Indicates whether metrics reporting is supported.
- IsmFormat - Class in org.apache.beam.runners.dataflow.internal
-
An Ism file is a prefix-encoded composite key-value file broken into shards.
- IsmFormat() - Constructor for class org.apache.beam.runners.dataflow.internal.IsmFormat
- IsmFormat.Footer - Class in org.apache.beam.runners.dataflow.internal
-
The footer stores the relevant information required to locate the index and bloom filter.
- IsmFormat.FooterCoder - Class in org.apache.beam.runners.dataflow.internal
-
A
Coder
for IsmFormat.Footer
. - IsmFormat.IsmRecord<V> - Class in org.apache.beam.runners.dataflow.internal
-
A record containing a composite key and either a value or metadata.
- IsmFormat.IsmRecordCoder<V> - Class in org.apache.beam.runners.dataflow.internal
-
A
Coder
for IsmFormat.IsmRecord
s. - IsmFormat.IsmShard - Class in org.apache.beam.runners.dataflow.internal
-
A shard descriptor containing shard id, the data block offset, and the index offset for the given shard.
- IsmFormat.IsmShardCoder - Class in org.apache.beam.runners.dataflow.internal
-
A coder for
IsmFormat.IsmShard
s. - IsmFormat.KeyPrefix - Class in org.apache.beam.runners.dataflow.internal
-
The prefix used before each key which contains the number of shared and unshared bytes from the previous key that was read.
- IsmFormat.KeyPrefixCoder - Class in org.apache.beam.runners.dataflow.internal
-
A
Coder
for IsmFormat.KeyPrefix
. - IsmFormat.MetadataKeyCoder<K> - Class in org.apache.beam.runners.dataflow.internal
-
A coder for the metadata key component.
- isModeSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- isMutable() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- isNan(Double) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.IsNan
- isNan(Float) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.IsNan
- IsNan - Class in org.apache.beam.sdk.extensions.sql.impl.udf
-
IS_NAN(X)
- IsNan() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.IsNan
- isNonMerging() - Method in class org.apache.beam.sdk.transforms.windowing.NonMergingWindowFn
- isNonMerging() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Returns true if this
WindowFn
never needs to merge any windows. - isNormalizedKeyPrefixOnly(int) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- isNull(String) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.IsNullFn
- isNullable() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.TypeWithNullability
- isNullable() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
- isNullable() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
Returns whether the field is nullable.
- IsNullFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.IsNullFn
- isNumericType() - Method in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- isOneOf(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.isOneOf(T...)
. - isOneOf(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.isOneOf(T...)
. - isOpen() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingReadableByteChannel
- isOpen() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- isOpen() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingWritableByteChannel
- isOutboundEdgeOfVertex(Edge, String, String, String) - Method in interface org.apache.beam.runners.jet.DAGBuilder.WiringListener
- isOutboundEdgeOfVertex(Edge, String, String, String) - Method in class org.apache.beam.runners.jet.processors.ParDoP.Supplier
- isOutboundEdgeOfVertex(Edge, String, String, String) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP.Supplier
- isOutboundEdgeOfVertex(Edge, String, String, String) - Method in class org.apache.beam.runners.jet.processors.FlattenP.Supplier
- isPreviewEnabled() - Method in class org.apache.beam.sdk.io.cdap.context.BatchSinkContextImpl
- isPreviewEnabled() - Method in class org.apache.beam.sdk.io.cdap.context.BatchSourceContextImpl
- isPreviewEnabled() - Method in class org.apache.beam.sdk.io.cdap.context.StreamingSourceContextImpl
- isPrimaryKey() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
True if the column is part of the primary key, false otherwise.
- isPrimitiveType() - Method in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- isProduceStatusUpdateOnEveryEvent() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Indicates if the status update needs to be sent after each event's processing.
- isProtoChangeRecord() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Returns true if the result set at the current pointer contains only one proto change record.
- isQueueNonExclusive(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClient
- isQueueNonExclusive(String) - Method in class org.apache.beam.sdk.io.solace.broker.SempBasicAuthClientExecutor
- isQueueNonExclusive(String) - Method in interface org.apache.beam.sdk.io.solace.broker.SempClient
-
Determines if the specified queue is non-exclusive.
- isReadOnly() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- isReadSeekEfficient() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
- isReady() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
- isReady() - Method in interface org.apache.beam.sdk.fn.stream.PrefetchableIterator
-
Returns
true
if and only if Iterator.hasNext()
and Iterator.next()
will not require an expensive operation. - isReceiverStopped() - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
- isRedistributed() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- isRegisterByteSizeObserverCheap(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
-
NullableCoder
is cheap if valueCoder
is cheap. - isRegisterByteSizeObserverCheap(HyperLogLogPlus) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
- isRegisterByteSizeObserverCheap(ByteString) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(IterableT) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(Boolean) - Method in class org.apache.beam.sdk.coders.BooleanCoder
- isRegisterByteSizeObserverCheap(Byte) - Method in class org.apache.beam.sdk.coders.ByteCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(Double) - Method in class org.apache.beam.sdk.coders.DoubleCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(Float) - Method in class org.apache.beam.sdk.coders.FloatCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(Integer) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(Integer) - Method in class org.apache.beam.sdk.coders.VarIntCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(Long) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(Long) - Method in class org.apache.beam.sdk.coders.VarLongCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(Short) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(Void) - Method in class org.apache.beam.sdk.coders.VoidCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(BigDecimal) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(BigInteger) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(Optional<T>) - Method in class org.apache.beam.sdk.coders.OptionalCoder
-
OptionalCoder
is cheap if valueCoder
is cheap. - isRegisterByteSizeObserverCheap(IsmFormat.Footer) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.FooterCoder
- isRegisterByteSizeObserverCheap(IsmFormat.KeyPrefix) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefixCoder
- isRegisterByteSizeObserverCheap(RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- isRegisterByteSizeObserverCheap(EncodedBoundedWindow) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
- isRegisterByteSizeObserverCheap(KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- isRegisterByteSizeObserverCheap(OffsetRange) - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
- isRegisterByteSizeObserverCheap(RawUnionValue) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
-
Since this coder uses elementCoders.get(index) and coders that are known to run in constant time, we defer the return value to that coder.
- isRegisterByteSizeObserverCheap(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
- isRegisterByteSizeObserverCheap(KV<K, V>) - Method in class org.apache.beam.sdk.coders.KvCoder
-
Returns whether both keyCoder and valueCoder are considered inexpensive.
- isRegisterByteSizeObserverCheap(ByteString) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(ProducerRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- isRegisterByteSizeObserverCheap(Instant) - Method in class org.apache.beam.sdk.coders.InstantCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(ReadableDuration) - Method in class org.apache.beam.sdk.coders.DurationCoder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.Coder
-
Returns whether
Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver)
is cheap enough to call for every element, that is, if this Coder
can calculate the byte size of the element to be coded in roughly constant time (or lazily). - isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
LengthPrefixCoder
is cheap if valueCoder
is cheap. - isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
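A sketch of overriding this hook in a custom coder; the delegating coder here is hypothetical and simply forwards to VarIntCoder, whose size estimate is constant-time:

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import org.apache.beam.sdk.coders.CustomCoder;
    import org.apache.beam.sdk.coders.VarIntCoder;

    class DelegatingIntCoder extends CustomCoder<Integer> {
      private final VarIntCoder inner = VarIntCoder.of();

      @Override public void encode(Integer value, OutputStream out) throws IOException {
        inner.encode(value, out);
      }
      @Override public Integer decode(InputStream in) throws IOException {
        return inner.decode(in);
      }
      @Override public boolean isRegisterByteSizeObserverCheap(Integer value) {
        return inner.isRegisterByteSizeObserverCheap(value); // constant time for var-ints
      }
    }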
- isResume() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
- isRowLocked(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
-
Returns true if row is locked.
- isRunnerDeterminedSharding() - Method in interface org.apache.beam.runners.direct.DirectTestOptions
- isSdfTimer(String) - Static method in class org.apache.beam.runners.fnexecution.control.BundleCheckpointHandlers.StateAndTimerBundleCheckpointHandler
-
A helper function to check whether the given timer is the one set for rescheduling
BeamFnApi.DelayedBundleApplication
. - isSetter(Method) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- isShouldReportDiagnosticMetrics() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Whether additional diagnostic metrics should be reported for a Transform.
- isSideInputLookupJoin() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
- isSimple() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Whether it's a simple name, with a single name component.
- IsSparkNativeTransform() - Constructor for class org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.IsSparkNativeTransform
- isSplittable() - Method in class org.apache.beam.sdk.io.CompressedSource
-
Determines whether a single file represented by this source is splittable.
- isSplittable() - Method in class org.apache.beam.sdk.io.FileBasedSource
-
Determines whether a file represented by this source can be split into bundles.
- isStart() - Method in class org.apache.beam.sdk.io.redis.RedisCursor
- isStarted() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
Returns true if there has been a call to
OffsetBasedSource.OffsetBasedReader.start()
. - isStarted() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
- isStreaming() - Method in interface org.apache.beam.sdk.options.StreamingOptions
-
Set to true if running a streaming pipeline.
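A small sketch of toggling this option programmatically (it can equally be set with --streaming=true on the command line):

    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.options.StreamingOptions;

    StreamingOptions options = PipelineOptionsFactory.create().as(StreamingOptions.class);
    options.setStreaming(true);
    boolean streaming = options.isStreaming(); // true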
- isStreamingEngine() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
- isStreamingSideInput() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Checks if any of the side inputs in the pipeline are streaming side inputs.
- isStringType() - Method in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- isStringType(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- isSubtypeOf(Schema.TypeName) - Method in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- isSubtypeOf(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Return true if this type is a subtype of the given type.
- isSuccess() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Returns whether this file was parsed successfully.
- isSuccess() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
- isSupertypeOf(Schema.TypeName) - Method in enum class org.apache.beam.sdk.schemas.Schema.TypeName
-
Whether this is a supertype of the other type.
- isSupertypeOf(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns true if this type is assignable from the given type.
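A minimal sketch of both checks using an anonymous TypeDescriptor to capture a generic type:

    import java.util.Collection;
    import java.util.List;
    import org.apache.beam.sdk.values.TypeDescriptor;

    TypeDescriptor<List<String>> listOfString = new TypeDescriptor<List<String>>() {};
    boolean sub = listOfString.isSubtypeOf(TypeDescriptor.of(Collection.class));   // true
    boolean sup = TypeDescriptor.of(Collection.class).isSupertypeOf(listOfString); // true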
- isSupported() - Method in enum class org.apache.beam.sdk.extensions.sql.meta.ProjectSupport
- isSystemTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
Whether the given transaction is a Spanner system transaction.
- isTableEmpty(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Returns true if the table is empty.
- isTableEmpty(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
- isTableEmpty(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- isTableResolved(Table) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
-
True if the table was resolved using the Calcite schema.
- isTerminal() - Method in enum class org.apache.beam.runners.local.ExecutionDriver.DriverState
- isTerminal() - Method in enum class org.apache.beam.sdk.PipelineResult.State
- isTerminated(JobApi.JobState.Enum) - Static method in class org.apache.beam.runners.jobsubmission.JobInvocation
- isTimer() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
- isTimestampCombinerSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- isTopicExists(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return true if
topic
exists. - isTopicExists(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- isTopicExists(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- isTopicExists(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- isTriggerSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- isTrue() - Method in interface org.apache.beam.io.requestresponse.CallShouldBackoff
-
Reports whether to back off.
- isTrustSelfSignedCerts() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
- isTupleType() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- isTupleType() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- isUnbounded() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Gets the value of the plugin type.
- isUnknown() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- isUnknown() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
-
Returns true if any of the values for rowCount, rate, or window is infinite.
- isUnknown() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
Return true if there is no timing information for the current
PaneInfo
. - isUpdate() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Whether to update the currently running pipeline with the same name as this one.
- isValid(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- isValidPartition(Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Checks if the partition's start key is before its end key.
- isWholeStream - Variable in class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated. Whether the encoded or decoded value fills the remainder of the output or input (resp.) record/stream contents.
- isWildcard(GcsPath) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns true if the given
spec
contains a wildcard. - isWrapperFor(Class<?>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- isWrapping() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
- isZero() - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
- isZero() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
- item(String, Boolean) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and boolean value.
- item(String, Class<T>) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and class value.
- item(String, Double) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and floating point value.
- item(String, Float) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and floating point value.
- item(String, Integer) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and integer value.
- item(String, Long) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and integer value.
- item(String, String) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and string value.
- item(String, ValueProvider<?>) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and
ValueProvider
. - item(String, DisplayData.Type, T) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key, type, and value.
- item(String, Duration) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and duration value.
- item(String, Instant) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and timestamp value.
- Item() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.Item
- items() - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
- items() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
- ItemSpec() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
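A sketch of how these item factories are typically used from populateDisplayData; the transform and the "query" key are illustrative:

    import org.apache.beam.sdk.transforms.PTransform;
    import org.apache.beam.sdk.transforms.display.DisplayData;
    import org.apache.beam.sdk.values.PCollection;

    class MyTransform extends PTransform<PCollection<String>, PCollection<String>> {
      private final String query = "SELECT 1";

      @Override
      public PCollection<String> expand(PCollection<String> input) {
        return input; // pass-through, for illustration only
      }

      @Override
      public void populateDisplayData(DisplayData.Builder builder) {
        super.populateDisplayData(builder);
        builder.add(DisplayData.item("query", query).withLabel("SQL Query"));
      }
    }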
- iterable() - Static method in class org.apache.beam.sdk.transforms.Materializations
-
For internal use only; no backwards-compatibility guarantees.
- iterable(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
- ITERABLE - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
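A small sketch of declaring an iterable field in a schema using the factory above:

    import org.apache.beam.sdk.schemas.Schema;

    Schema schema = Schema.builder()
        .addStringField("id")
        .addField("values", Schema.FieldType.iterable(Schema.FieldType.INT64))
        .build();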
- ITERABLE_MATERIALIZATION_URN - Static variable in class org.apache.beam.sdk.transforms.Materializations
-
The URN for a
Materialization
where the primitive view type is an iterable of fully specified windowed values. - IterableBackedListViewFn(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
- IterableCoder<T> - Class in org.apache.beam.sdk.coders
- IterableCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.IterableCoder
- IterableLikeCoder<T, IterableT> - Class in org.apache.beam.sdk.coders
An abstract base class with functionality for assembling a
Coder
for a class that implementsIterable
. - IterableLikeCoder(Coder<T>, String) - Constructor for class org.apache.beam.sdk.coders.IterableLikeCoder
- iterables() - Static method in class org.apache.beam.sdk.transforms.Flatten
-
Returns a
PTransform
that takes aPCollection<Iterable<T>>
and returns aPCollection<T>
containing all the elements from all theIterable
s. - iterables() - Static method in class org.apache.beam.sdk.transforms.ToString
-
Transforms each item in the iterable of the input
PCollection
to aString
using theObject.toString()
method followed by a "," until the last element in the iterable. - iterables(String) - Static method in class org.apache.beam.sdk.transforms.ToString
-
Transforms each item in the iterable of the input
PCollection
to aString
using theObject.toString()
method followed by the specified delimiter until the last element in the iterable. - iterables(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The
TypeDescriptor
forIterable
. - iterableView(PCollection<T>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a
PCollectionView<Iterable<T>>
capable of processing elements windowed using the providedWindowingStrategy
. - IterableViewFn(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
-
Deprecated.
- IterableViewFn2(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
- iterableViewUsingVoidKey(TupleTag<Materializations.MultimapView<Void, T>>, PCollection<KV<Void, T>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
- iterableWithSize(int) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.iterableWithSize(int)
. - iterableWithSize(SerializableMatcher<? super Integer>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.iterableWithSize(Matcher)
. - iterator() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- iterator() - Method in interface org.apache.beam.sdk.fn.stream.PrefetchableIterable
- iterator() - Method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables.Default
- iterator() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
- iterator() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- iterator() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
J
- jarPath() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.JavaScalarFunction
-
The Beam filesystem path to the jar where the method was defined.
- JAVA_CLASS - Enum constant in enum class org.apache.beam.sdk.transforms.display.DisplayData.Type
- javaAggregateFunctions() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions
- JavaBeanSchema - Class in org.apache.beam.sdk.schemas
-
A
SchemaProvider
for Java Bean objects. - JavaBeanSchema() - Constructor for class org.apache.beam.sdk.schemas.JavaBeanSchema
- JavaBeanSchema.GetterTypeSupplier - Class in org.apache.beam.sdk.schemas
-
FieldValueTypeSupplier
that's based on getter methods. - JavaBeanSchema.SetterTypeSupplier - Class in org.apache.beam.sdk.schemas
-
FieldValueTypeSupplier
that's based on setter methods. - JavaBeanUtils - Class in org.apache.beam.sdk.schemas.utils
-
A set of utilities to generate getter and setter classes for JavaBean objects.
- JavaBeanUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- JavaClassLookupAllowListFactory() - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionServiceOptions.JavaClassLookupAllowListFactory
- JavaExplodeTransformProvider - Class in org.apache.beam.sdk.schemas.transforms.providers
-
An implementation of
TypedSchemaTransformProvider
for Explode. - JavaExplodeTransformProvider() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
- JavaExplodeTransformProvider.Configuration - Class in org.apache.beam.sdk.schemas.transforms.providers
- JavaExplodeTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
- JavaExplodeTransformProvider.ExplodeTransform - Class in org.apache.beam.sdk.schemas.transforms.providers
-
A
SchemaTransform
for Explode. - JavaFieldSchema - Class in org.apache.beam.sdk.schemas
-
A
SchemaProvider
for Java POJO objects. - JavaFieldSchema() - Constructor for class org.apache.beam.sdk.schemas.JavaFieldSchema
- JavaFieldSchema.JavaFieldTypeSupplier - Class in org.apache.beam.sdk.schemas
-
FieldValueTypeSupplier
that's based on public fields. - JavaFieldTypeSupplier() - Constructor for class org.apache.beam.sdk.schemas.JavaFieldSchema.JavaFieldTypeSupplier
- JavaFilterTransformProvider - Class in org.apache.beam.sdk.schemas.transforms.providers
-
An implementation of
TypedSchemaTransformProvider
for Filter for the Java language. - JavaFilterTransformProvider() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
- JavaFilterTransformProvider.Configuration - Class in org.apache.beam.sdk.schemas.transforms.providers
- JavaFilterTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
- JavaFilterTransformProvider.JavaFilterTransform - Class in org.apache.beam.sdk.schemas.transforms.providers
-
A
SchemaTransform
for Filter-java. - javaIterator(Iterator<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
-
Java
Iterator
of Scala Iterator
. - JavaMapToFieldsTransformProvider - Class in org.apache.beam.sdk.schemas.transforms.providers
-
An implementation of
TypedSchemaTransformProvider
for MapToFields for the Java language. - JavaMapToFieldsTransformProvider() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
- JavaMapToFieldsTransformProvider.Configuration - Class in org.apache.beam.sdk.schemas.transforms.providers
- JavaMapToFieldsTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
- JavaMapToFieldsTransformProvider.JavaMapToFieldsTransform - Class in org.apache.beam.sdk.schemas.transforms.providers
-
A
SchemaTransform
for MapToFields-java. - JavaRowUdf - Class in org.apache.beam.sdk.schemas.transforms.providers
- JavaRowUdf(JavaRowUdf.Configuration, Schema) - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf
- JavaRowUdf.Configuration - Class in org.apache.beam.sdk.schemas.transforms.providers
- JavaRowUdf.Configuration.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
- JavaScalarFunction() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.JavaScalarFunction
- javaScalarFunctions() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions
- javaTypeForFieldType(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.FieldTypeDescriptors
-
Get a
TypeDescriptor
from a Schema.FieldType
. - JavaUdfLoader - Class in org.apache.beam.sdk.extensions.sql.impl
-
Loads
UdfProvider
implementations from user-provided jars. - JavaUdfLoader() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.JavaUdfLoader
- JAXBCoder<T> - Class in org.apache.beam.sdk.io.xml
-
A coder for JAXB annotated objects.
- JcsmpSessionService - Class in org.apache.beam.sdk.io.solace.broker
-
A class that manages a connection to a Solace broker using basic authentication.
- JcsmpSessionService() - Constructor for class org.apache.beam.sdk.io.solace.broker.JcsmpSessionService
- JdbcConnection - Class in org.apache.beam.sdk.extensions.sql.impl
-
Beam JDBC Connection.
- JdbcDriver - Class in org.apache.beam.sdk.extensions.sql.impl
-
Calcite JDBC driver with Beam defaults.
- JdbcDriver() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
- JdbcIO - Class in org.apache.beam.sdk.io.jdbc
-
IO to read and write data on JDBC.
- JdbcIO.DataSourceConfiguration - Class in org.apache.beam.sdk.io.jdbc
-
A POJO describing a
DataSource
, either providing directly aDataSource
or all properties allowing to create aDataSource
. - JdbcIO.DataSourceProviderFromDataSourceConfiguration - Class in org.apache.beam.sdk.io.jdbc
-
Wraps a
JdbcIO.DataSourceConfiguration
to provide aDataSource
. - JdbcIO.DefaultRetryStrategy - Class in org.apache.beam.sdk.io.jdbc
-
This is the default
Predicate
we use to detect DeadLock. - JdbcIO.PoolableDataSourceProvider - Class in org.apache.beam.sdk.io.jdbc
-
Wraps a
JdbcIO.DataSourceConfiguration
to provide aPoolingDataSource
. - JdbcIO.PreparedStatementSetter<T> - Interface in org.apache.beam.sdk.io.jdbc
-
An interface used by the JdbcIO Write to set the parameters of the
PreparedStatement
used to setParameters into the database. - JdbcIO.Read<T> - Class in org.apache.beam.sdk.io.jdbc
-
Implementation of
JdbcIO.read()
. - JdbcIO.ReadAll<ParameterT,
OutputT> - Class in org.apache.beam.sdk.io.jdbc -
Implementation of
JdbcIO.readAll()
. - JdbcIO.ReadRows - Class in org.apache.beam.sdk.io.jdbc
-
Implementation of
JdbcIO.readRows()
. - JdbcIO.ReadWithPartitions<T,
PartitionColumnT> - Class in org.apache.beam.sdk.io.jdbc - JdbcIO.RetryConfiguration - Class in org.apache.beam.sdk.io.jdbc
-
Builder used to help with retry configuration for
JdbcIO
. - JdbcIO.RetryStrategy - Interface in org.apache.beam.sdk.io.jdbc
-
An interface used to control if we retry the statements when a
SQLException
occurs. - JdbcIO.RowMapper<T> - Interface in org.apache.beam.sdk.io.jdbc
-
An interface used by
JdbcIO.Read
for converting each row of theResultSet
into an element of the resultingPCollection
. - JdbcIO.StatementPreparator - Interface in org.apache.beam.sdk.io.jdbc
-
An interface used by the JdbcIO Write to set the parameters of the
PreparedStatement
used to setParameters into the database. - JdbcIO.Write<T> - Class in org.apache.beam.sdk.io.jdbc
-
This class is used as the default return value of
JdbcIO.write()
. - JdbcIO.WriteVoid<T> - Class in org.apache.beam.sdk.io.jdbc
-
A
PTransform
to write to a JDBC datasource. - JdbcIO.WriteWithResults<T,
V> - Class in org.apache.beam.sdk.io.jdbc -
A
PTransform
to write to a JDBC datasource. - JdbcReadSchemaTransform(JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration, String) - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransform
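Taken together, the JdbcIO entries above describe a complete read path. A minimal sketch, assuming a Postgres driver on the classpath and an existing pipeline variable (the connection details and query are illustrative; recent Beam versions can often infer the coder, making withCoder optional):

    PCollection<KV<Integer, String>> rows = pipeline.apply(
        JdbcIO.<KV<Integer, String>>read()
            .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(
                "org.postgresql.Driver", "jdbc:postgresql://localhost:5432/mydb"))
            .withQuery("SELECT id, name FROM users")
            // The RowMapper converts each ResultSet row into a pipeline element.
            .withRowMapper(rs -> KV.of(rs.getInt("id"), rs.getString("name")))
            .withCoder(KvCoder.of(VarIntCoder.of(), StringUtf8Coder.of())));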
- JdbcReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- JdbcReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.jdbc
-
An implementation of
SchemaTransformProvider
for reading from JDBC connections usingJdbcIO
. - JdbcReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
- JdbcReadSchemaTransformProvider.JdbcReadSchemaTransform - Class in org.apache.beam.sdk.io.jdbc
- JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.jdbc
- JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.jdbc
- JdbcReadWithPartitionsHelper<PartitionT> - Interface in org.apache.beam.sdk.io.jdbc
-
A helper for JdbcIO.ReadWithPartitions that handles range calculations.
- JdbcSchemaIOProvider - Class in org.apache.beam.sdk.io.jdbc
-
An implementation of SchemaIOProvider for reading and writing with JdbcIO.
- JdbcSchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
- jdbcType() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
- jdbcType() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
- jdbcType() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromMySqlSchemaTransformProvider
- jdbcType() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromOracleSchemaTransformProvider
- jdbcType() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromPostgresSchemaTransformProvider
- jdbcType() - Method in class org.apache.beam.sdk.io.jdbc.providers.ReadFromSqlServerSchemaTransformProvider
- jdbcType() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToMySqlSchemaTransformProvider
- jdbcType() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToOracleSchemaTransformProvider
- jdbcType() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToPostgresSchemaTransformProvider
- jdbcType() - Method in class org.apache.beam.sdk.io.jdbc.providers.WriteToSqlServerSchemaTransformProvider
- jdbcUrl() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
- JdbcUtil - Class in org.apache.beam.sdk.io.jdbc
-
Provides utility functions for working with JdbcIO.
- JdbcUtil() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcUtil
- JdbcWriteResult - Class in org.apache.beam.sdk.io.jdbc
-
The result of writing a row to a JDBC datasource.
- JdbcWriteResult() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcWriteResult
- JdbcWriteSchemaTransform(JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration, String) - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransform
- JdbcWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- JdbcWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.jdbc
-
An implementation of SchemaTransformProvider for writing to JDBC connections using JdbcIO.
- JdbcWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
- JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransform - Class in org.apache.beam.sdk.io.jdbc
- JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.jdbc
- JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.jdbc
- JetMetricResults - Class in org.apache.beam.runners.jet.metrics
-
Jet specific MetricResults.
- JetMetricResults(IMap<String, MetricUpdates>) - Constructor for class org.apache.beam.runners.jet.metrics.JetMetricResults
- JetMetricsContainer - Class in org.apache.beam.runners.jet.metrics
-
Jet specific implementation of MetricsContainer.
- JetMetricsContainer(String, String, Processor.Context) - Constructor for class org.apache.beam.runners.jet.metrics.JetMetricsContainer
- JetPipelineOptions - Interface in org.apache.beam.runners.jet
-
Pipeline options specific to the Jet runner.
- JetPipelineResult - Class in org.apache.beam.runners.jet
-
Jet specific implementation of PipelineResult.
- JetRunner - Class in org.apache.beam.runners.jet
-
Jet specific implementation of Beam's PipelineRunner.
- JetRunnerRegistrar - Class in org.apache.beam.runners.jet
- JetRunnerRegistrar.Options - Class in org.apache.beam.runners.jet
-
Registers the
JetPipelineOptions
. - JetRunnerRegistrar.Runner - Class in org.apache.beam.runners.jet
-
Registers the
JetRunner
. - JmsIO - Class in org.apache.beam.sdk.io.jms
-
An unbounded source for JMS destinations (queues or topics).
- JmsIO.ConnectionFactoryContainer<T> - Interface in org.apache.beam.sdk.io.jms
- JmsIO.MessageMapper<T> - Interface in org.apache.beam.sdk.io.jms
-
An interface used by JmsIO.Read for converting each JMS Message into an element of the resulting PCollection.
- JmsIO.Read<T> - Class in org.apache.beam.sdk.io.jms
-
A PTransform to read from a JMS destination.
- JmsIO.Write<EventT> - Class in org.apache.beam.sdk.io.jms
-
A PTransform to write to a JMS queue.
- JmsIOException - Exception Class in org.apache.beam.sdk.io.jms
- JmsIOException(String) - Constructor for exception class org.apache.beam.sdk.io.jms.JmsIOException
- JmsIOException(String, Throwable) - Constructor for exception class org.apache.beam.sdk.io.jms.JmsIOException
- JmsRecord - Class in org.apache.beam.sdk.io.jms
-
JmsRecord contains the message payload of the record as well as metadata (JMS headers and properties).
- JmsRecord(String, long, String, Destination, Destination, int, boolean, String, long, int, Map<String, Object>, String) - Constructor for class org.apache.beam.sdk.io.jms.JmsRecord
- JOB_ID - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
-
MRJobConfig.ID.
- JOB_PORT_FLAG_NAME - Static variable in interface org.apache.beam.runners.prism.PrismPipelineOptions
- JobBundleFactory - Interface in org.apache.beam.runners.fnexecution.control
-
A factory that has all job-scoped information, and can be combined with stage-scoped information to create a StageBundleFactory.
- jobId() - Method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
- JobInfo - Class in org.apache.beam.runners.fnexecution.provisioning
-
A subset of ProvisionApi.ProvisionInfo that specifies a unique job, while omitting fields that are not known to the runner operator.
- JobInfo() - Constructor for class org.apache.beam.runners.fnexecution.provisioning.JobInfo
- JobInvocation - Class in org.apache.beam.runners.jobsubmission
-
Internal representation of a Job which has been invoked (prepared and run) by a client.
- JobInvocation(JobInfo, ListeningExecutorService, RunnerApi.Pipeline, PortablePipelineRunner) - Constructor for class org.apache.beam.runners.jobsubmission.JobInvocation
- JobInvoker - Class in org.apache.beam.runners.jobsubmission
-
Factory to create JobInvocation instances.
- JobInvoker(String) - Constructor for class org.apache.beam.runners.jobsubmission.JobInvoker
- jobName() - Method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
- JobNameFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.JobNameFactory
- JobPreparation - Class in org.apache.beam.runners.jobsubmission
-
A job that has been prepared, but not invoked.
- JobPreparation() - Constructor for class org.apache.beam.runners.jobsubmission.JobPreparation
- JobServerDriver - Class in org.apache.beam.runners.jobsubmission
-
Shared code for starting and serving an InMemoryJobService.
- JobServerDriver(JobServerDriver.ServerConfiguration, ServerFactory, ServerFactory, JobServerDriver.JobInvokerFactory) - Constructor for class org.apache.beam.runners.jobsubmission.JobServerDriver
- JobServerDriver.JobInvokerFactory - Interface in org.apache.beam.runners.jobsubmission
- JobServerDriver.ServerConfiguration - Class in org.apache.beam.runners.jobsubmission
-
Configuration for the jobServer.
- JobSpecification(Job, RunnerApi.Pipeline, Map<AppliedPTransform<?, ?, ?>, String>) - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
- jobToString(Job) - Static method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
-
Renders a Job as a string.
- join(String, CoGroup.By) - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.ExpandCrossProduct
-
Select the following fields for the specified PCollection with the specified join args.
- join(String, CoGroup.By) - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.Impl
-
Select the following fields for the specified PCollection with the specified join args.
- join(String, CoGroup.By) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup
-
Specify the following join arguments (including fields to join by) for the specified PCollection.
- join(CoGroup.By) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup
-
Join all input PCollections using the same args.
- Join - Class in org.apache.beam.sdk.extensions.joinlibrary
-
Utility class with different versions of joins.
- Join - Class in org.apache.beam.sdk.schemas.transforms
-
A transform that performs equijoins across two schema PCollections.
- Join() - Constructor for class org.apache.beam.sdk.extensions.joinlibrary.Join
- Join() - Constructor for class org.apache.beam.sdk.schemas.transforms.Join
- Join.FieldsEqual - Class in org.apache.beam.sdk.schemas.transforms
-
Predicate object to specify fields to compare when doing an equi-join.
- Join.FieldsEqual.Impl - Class in org.apache.beam.sdk.schemas.transforms
-
Implementation class for FieldsEqual.
- Join.FullOuterJoin<K, V1, V2> - Class in org.apache.beam.sdk.extensions.joinlibrary
-
PTransform representing a full outer join of two collections of KV elements.
- Join.Impl<LhsT, RhsT> - Class in org.apache.beam.sdk.schemas.transforms
-
Implementation class for Join.
- Join.InnerJoin<K, V1, V2> - Class in org.apache.beam.sdk.extensions.joinlibrary
-
PTransform representing an inner join of two collections of KV elements.
- Join.LeftOuterJoin<K, V1, V2> - Class in org.apache.beam.sdk.extensions.joinlibrary
-
PTransform representing a left outer join of two collections of KV elements.
- Join.RightOuterJoin<K, V1, V2> - Class in org.apache.beam.sdk.extensions.joinlibrary
-
PTransform representing a right outer join of two collections of KV elements.
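The two Join flavors indexed above are used differently; a brief sketch, where the input PCollections and the "userId" join field are illustrative placeholders:

    // Schema-aware equijoin (org.apache.beam.sdk.schemas.transforms.Join);
    // users and purchases are schema'd PCollection<Row>s sharing a "userId" field.
    PCollection<Row> joined =
        users.apply(Join.<Row, Row>innerJoin(purchases).using("userId"));

    // Join library variant (org.apache.beam.sdk.extensions.joinlibrary.Join),
    // operating on KV pairs already keyed by the join key.
    PCollection<KV<String, KV<Long, Double>>> kvJoined =
        org.apache.beam.sdk.extensions.joinlibrary.Join.innerJoin(leftKvs, rightKvs);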
- JoinAsLookup(RexNode, BeamSqlSeekableTable, Schema, Schema, int, int) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms.JoinAsLookup
- JoinRelOptRuleCall - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
A class that intercepts the built join and checks whether it is a legal join before passing it to the actual RelOptRuleCall.
- JoinRelOptRuleCall.JoinChecker - Interface in org.apache.beam.sdk.extensions.sql.impl.rule
-
A function that takes the output relation and checks whether it is a legal relational node.
- JsonArrayCoder - Class in org.apache.beam.sdk.io.gcp.healthcare
- JsonArrayCoder() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
- jsonBytesLike(String) - Static method in class org.apache.beam.sdk.testing.JsonMatcher
- jsonBytesLike(Map<String, Object>) - Static method in class org.apache.beam.sdk.testing.JsonMatcher
- JsonIO - Class in org.apache.beam.sdk.io.json
-
PTransforms for reading and writing JSON files.
- JsonIO() - Constructor for class org.apache.beam.sdk.io.json.JsonIO
- JsonIO.Write<T> - Class in org.apache.beam.sdk.io.json
-
PTransform for writing JSON files.
- JsonMatcher<T> - Class in org.apache.beam.sdk.testing
-
Matcher to compare a string or byte[] representing a JSON Object, independent of field order.
- JsonMatcher(Map<String, Object>) - Constructor for class org.apache.beam.sdk.testing.JsonMatcher
- JsonPayloadSerializerProvider - Class in org.apache.beam.sdk.schemas.io.payloads
- JsonPayloadSerializerProvider() - Constructor for class org.apache.beam.sdk.schemas.io.payloads.JsonPayloadSerializerProvider
- JsonReadSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
-
A
FileReadSchemaTransformFormatProvider
that reads newline-delimited JSONs. - JsonReadSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.JsonReadSchemaTransformFormatProvider
- JSON-Schema (https://json-schema.org) support - Search tag in class org.apache.beam.sdk.schemas.utils.JsonUtils
- Section
- jsonSchemaFromBeamSchema(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
- jsonSchemaStringFromBeamSchema(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
- JSON-Schema supported features - Search tag in class org.apache.beam.sdk.schemas.utils.JsonUtils
- Section
- jsonStringLike(String) - Static method in class org.apache.beam.sdk.testing.JsonMatcher
- jsonStringLike(Map<String, Object>) - Static method in class org.apache.beam.sdk.testing.JsonMatcher
- JsonToRow - Class in org.apache.beam.sdk.transforms
- JsonToRow() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow
- JsonToRow.JsonToRowWithErrFn - Class in org.apache.beam.sdk.transforms
- JsonToRow.JsonToRowWithErrFn.Builder - Class in org.apache.beam.sdk.transforms
- JsonToRow.JsonToRowWithErrFn.ParseWithError - Class in org.apache.beam.sdk.transforms
- JsonToRow.JsonToRowWithErrFn.ParseWithError.Builder - Class in org.apache.beam.sdk.transforms
- JsonToRow.ParseResult - Class in org.apache.beam.sdk.transforms
-
The result of a JsonToRow.withExceptionReporting(Schema) transform.
- JsonToRow.ParseResult.Builder - Class in org.apache.beam.sdk.transforms
- JsonToRowWithErrFn() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
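A short sketch of the JsonToRow entries above (the schema and the jsonStrings input are illustrative):

    Schema schema = Schema.builder().addStringField("name").addInt32Field("age").build();
    JsonToRow.ParseResult result =
        jsonStrings.apply(JsonToRow.withExceptionReporting(schema));
    PCollection<Row> rows = result.getResults();              // successfully parsed rows
    PCollection<Row> failed = result.getFailedToParseLines(); // lines that failed to parse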
- JsonUtils - Class in org.apache.beam.sdk.schemas.utils
-
Utils to convert JSON records to Beam Rows.
- JsonUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.JsonUtils
- jsonValueFromMessageValue(Descriptors.FieldDescriptor, Object, boolean, Predicate<String>, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- JsonWriteConfiguration() - Constructor for class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration
- JsonWriteSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
-
A FileWriteSchemaTransformFormatProvider for JSON format.
- JsonWriteSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.JsonWriteSchemaTransformFormatProvider
- JsonWriteTransformProvider - Class in org.apache.beam.sdk.io.json.providers
-
An implementation of TypedSchemaTransformProvider for JsonIO.write(java.lang.String).
- JsonWriteTransformProvider() - Constructor for class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider
- JsonWriteTransformProvider.JsonWriteConfiguration - Class in org.apache.beam.sdk.io.json.providers
-
Configuration for writing JSON files.
- JsonWriteTransformProvider.JsonWriteConfiguration.Builder - Class in org.apache.beam.sdk.io.json.providers
-
Builder for JsonWriteTransformProvider.JsonWriteConfiguration.
- JsonWriteTransformProvider.JsonWriteTransform - Class in org.apache.beam.sdk.io.json.providers
- JvmInitializer - Interface in org.apache.beam.sdk.harness
-
A service interface for defining one-time initialization of the JVM during pipeline execution.
- JvmInitializers - Class in org.apache.beam.sdk.fn
-
Helpers for executing JvmInitializer implementations.
- JvmInitializers() - Constructor for class org.apache.beam.sdk.fn.JvmInitializers
K
- KAFKA - Static variable in class org.apache.beam.sdk.managed.Managed
- KAFKA_READ_OVERRIDE - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
A PTransformOverride for runners to swap KafkaIO.Read.ReadFromKafkaViaSDF to the legacy Kafka read if the runner does not have good support for executing unbounded Splittable DoFns.
- KAFKA_READ_WITH_METADATA_TRANSFORM_URN_V2 - Static variable in class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation
- KAFKA_WRITE_TRANSFORM_URN_V2 - Static variable in class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation
- KafkaCheckpointMark - Class in org.apache.beam.sdk.io.kafka
-
Checkpoint for a KafkaUnboundedReader.
- KafkaCheckpointMark(List<KafkaCheckpointMark.PartitionMark>, Optional<KafkaUnboundedReader<?, ?>>) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
- KafkaCheckpointMark.PartitionMark - Class in org.apache.beam.sdk.io.kafka
-
A tuple to hold topic, partition, and offset that comprise the checkpoint for a single partition.
- KafkaCommitOffset<K, V> - Class in org.apache.beam.sdk.io.kafka
-
A PTransform that commits offsets of KafkaRecords.
- KafkaConnectUtils - Class in org.apache.beam.io.debezium
- KafkaConnectUtils() - Constructor for class org.apache.beam.io.debezium.KafkaConnectUtils
- KafkaIO - Class in org.apache.beam.sdk.io.kafka
-
An unbounded source and a sink for Kafka topics.
- KafkaIO.Read<K, V> - Class in org.apache.beam.sdk.io.kafka
-
A PTransform to read from Kafka topics.
- KafkaIO.Read.External - Class in org.apache.beam.sdk.io.kafka
-
Exposes KafkaIO.TypedWithoutMetadata as an external transform for cross-language usage.
- KafkaIO.Read.External.Configuration - Class in org.apache.beam.sdk.io.kafka
-
Parameters class to expose the Read transform to an external SDK.
- KafkaIO.Read.FakeFlinkPipelineOptions - Interface in org.apache.beam.sdk.io.kafka
- KafkaIO.ReadSourceDescriptors<K, V> - Class in org.apache.beam.sdk.io.kafka
-
A PTransform to read from KafkaSourceDescriptor.
- KafkaIO.TypedWithoutMetadata<K, V> - Class in org.apache.beam.sdk.io.kafka
-
A PTransform to read from Kafka topics.
- KafkaIO.Write<K, V> - Class in org.apache.beam.sdk.io.kafka
-
A PTransform to write to a Kafka topic with KVs.
- KafkaIO.Write.External - Class in org.apache.beam.sdk.io.kafka
-
Exposes KafkaIO.Write as an external transform for cross-language usage.
- KafkaIO.Write.External.Configuration - Class in org.apache.beam.sdk.io.kafka
-
Parameters class to expose the Write transform to an external SDK.
- KafkaIO.WriteRecords<K, V> - Class in org.apache.beam.sdk.io.kafka
-
A PTransform to write to a Kafka topic with ProducerRecords.
- KafkaIOInitializer - Class in org.apache.beam.sdk.io.kafka
-
Initialize KafkaIO feature flags on worker.
- KafkaIOInitializer() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIOInitializer
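A minimal sketch of the KafkaIO.Read entries above, assuming the Kafka client deserializers are on the classpath (broker address, topic, and the pipeline variable are illustrative):

    PCollection<KafkaRecord<Long, String>> records = pipeline.apply(
        KafkaIO.<Long, String>read()
            .withBootstrapServers("broker-1:9092")
            .withTopic("my_topic")
            .withKeyDeserializer(LongDeserializer.class)
            .withValueDeserializer(StringDeserializer.class));
    // Appending .withoutMetadata() instead yields the KafkaIO.TypedWithoutMetadata
    // variant, which produces a PCollection<KV<Long, String>>.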
- KafkaIOTranslation - Class in org.apache.beam.sdk.io.kafka.upgrade
-
Utility methods for translating KafkaIO transforms to and from RunnerApi representations.
- KafkaIOTranslation() - Constructor for class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation
- KafkaIOTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.kafka.upgrade
- KafkaIOTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.kafka.upgrade
- KafkaIOUtils - Class in org.apache.beam.sdk.io.kafka
-
Common utility functions and default configurations for
KafkaIO.Read
andKafkaIO.ReadSourceDescriptors
. - KafkaIOUtils() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIOUtils
- KafkaIOUtils.MovingAvg - Class in org.apache.beam.sdk.io.kafka
- KafkaIOUtilsBenchmark - Class in org.apache.beam.sdk.io.kafka.jmh
- KafkaIOUtilsBenchmark() - Constructor for class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- KafkaIOUtilsBenchmark.AtomicAccumulatorState - Class in org.apache.beam.sdk.io.kafka.jmh
- KafkaIOUtilsBenchmark.PlainAccumulatorState - Class in org.apache.beam.sdk.io.kafka.jmh
- KafkaIOUtilsBenchmark.ProducerState - Class in org.apache.beam.sdk.io.kafka.jmh
- KafkaIOUtilsBenchmark.VolatileAccumulatorState - Class in org.apache.beam.sdk.io.kafka.jmh
- KafkaMetrics - Interface in org.apache.beam.sdk.io.kafka
-
Stores and exports metrics for a batch of Kafka Client RPCs.
- KafkaMetrics.KafkaMetricsImpl - Class in org.apache.beam.sdk.io.kafka
-
Metrics of a batch of RPCs.
- KafkaMetrics.NoOpKafkaMetrics - Class in org.apache.beam.sdk.io.kafka
-
No-op implementation of KafkaMetrics.
- KafkaMetricsImpl() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
- KafkaPublishTimestampFunction<T> - Interface in org.apache.beam.sdk.io.kafka
-
An interface for providing custom timestamp for elements written to Kafka.
- KafkaReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.kafka
-
Configuration for reading from a Kafka topic.
- KafkaReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- KafkaReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.kafka
-
Builder for the KafkaReadSchemaTransformConfiguration.
- KafkaReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.kafka
- KafkaReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- KafkaReadSchemaTransformProvider.ErrorFn - Class in org.apache.beam.sdk.io.kafka
- KafkaRecord<K, V> - Class in org.apache.beam.sdk.io.kafka
-
KafkaRecord contains key and value of the record as well as metadata for the record (topic name, partition id, and offset).
- KafkaRecord(String, int, long, long, KafkaTimestampType, Headers, K, V) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaRecord
- KafkaRecord(String, int, long, long, KafkaTimestampType, Headers, KV<K, V>) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaRecord
- KafkaRecordCoder<K, V> - Class in org.apache.beam.sdk.io.kafka
-
Coder for KafkaRecord.
- KafkaRecordCoder(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- KafkaSchemaTransformTranslation - Class in org.apache.beam.sdk.io.kafka
- KafkaSchemaTransformTranslation() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation
- KafkaSchemaTransformTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.kafka
- KafkaSchemaTransformTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.kafka
- KafkaSinkMetrics - Class in org.apache.beam.sdk.io.kafka
-
Helper class to create per worker metrics for Kafka Sink stages.
- KafkaSinkMetrics() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
- KafkaSourceConsumerFn<T> - Class in org.apache.beam.io.debezium
-
Quick Overview
- KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory - Class in org.apache.beam.io.debezium
- KafkaSourceDescriptor - Class in org.apache.beam.sdk.io.kafka
-
Represents a Kafka source description.
- KafkaSourceDescriptor() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSourceDescriptor
- KafkaTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.kafka
-
Kafka table provider.
- KafkaTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.KafkaTableProvider
- KafkaTimestampType - Enum Class in org.apache.beam.sdk.io.kafka
-
This is a copy of Kafka's TimestampType.
- KafkaWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- KafkaWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.kafka
- KafkaWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.kafka
- KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.kafka
- KEEP_NESTED_NAME - Static variable in class org.apache.beam.sdk.schemas.utils.SelectHelpers
-
This policy keeps the raw nested field name.
- keepEarliest() - Static method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
- keeping(List<String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- keepLatest() - Static method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
- keepMostNestedFieldName() - Method in class org.apache.beam.sdk.schemas.transforms.Select.Flattened
-
For nested fields, keep just the most-nested field name.
- key() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItem
- key(String) - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey.Builder
- KEY - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- KEY_FIELD_PROPERTY - Static variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
- keyCoder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
- keyCoder() - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.MultimapSideInputHandler
-
Returns the Coder to use for the elements of the resulting keys iterable.
- keyed(CombineWithContext.CombineFnWithContext<V, AccumT, OutputT>, SerializablePipelineOptions, Map<TupleTag<?>, KV<WindowingStrategy<?, ?>, SideInputBroadcast<?>>>, WindowingStrategy<?, ?>) - Static method in class org.apache.beam.runners.spark.translation.SparkCombineFn
- KeyedBufferingElementsHandler - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput
-
A keyed implementation of a BufferingElementsHandler.
- KeyedPCollectionTuple<K> - Class in org.apache.beam.sdk.transforms.join
-
An immutable tuple of keyed PCollections with key type K.
- KeyedPCollectionTuple.TaggedKeyedPCollection<K, V> - Class in org.apache.beam.sdk.transforms.join
-
A utility class to help ensure coherence of tag and input PCollection types.
- keyedStateInternals - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
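A sketch of KeyedPCollectionTuple feeding a CoGroupByKey (the tags and the two keyed inputs are illustrative placeholders):

    TupleTag<Integer> ordersTag = new TupleTag<>();
    TupleTag<String> shipmentsTag = new TupleTag<>();
    PCollection<KV<String, CoGbkResult>> grouped =
        KeyedPCollectionTuple.of(ordersTag, orders)   // PCollection<KV<String, Integer>>
            .and(shipmentsTag, shipments)             // PCollection<KV<String, String>>
            .apply(CoGroupByKey.create());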
- keyedValues() - Static method in class org.apache.beam.sdk.transforms.Deduplicate
-
Returns a deduplication transform that deduplicates keyed values using the key for up to 10 mins within the processing time domain.
- keyEncoderOf(KvCoder<K, V>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
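A sketch of the Deduplicate.keyedValues() entry above (the keyedEvents input is an illustrative placeholder):

    // Elements that share a key are deduplicated for up to 10 minutes of processing time.
    PCollection<KV<String, String>> deduped =
        keyedEvents.apply(Deduplicate.<String, String>keyedValues());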
- keyField - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- KeyPairUtils - Class in org.apache.beam.sdk.io.snowflake
- KeyPairUtils() - Constructor for class org.apache.beam.sdk.io.snowflake.KeyPairUtils
- KeyPart() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.KeyPart
- KeyPrefix() - Constructor for class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefix
- KeyPrefixCoder() - Constructor for class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefixCoder
- keys() - Method in interface org.apache.beam.sdk.state.MapState
-
Returns an Iterable over the keys contained in this map.
- keys() - Method in interface org.apache.beam.sdk.state.MultimapState
-
Returns an Iterable over the keys contained in this multimap.
- Keys<K> - Class in org.apache.beam.sdk.transforms
-
Keys<K> takes a PCollection of KV<K, V>s and returns a PCollection<K> of the keys.
- keySet() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- kind - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- KinesisClientThrottledException - Exception Class in org.apache.beam.sdk.io.aws2.kinesis
-
Thrown when the Kinesis client was throttled due to rate limits.
- KinesisClientThrottledException(String, KinesisException) - Constructor for exception class org.apache.beam.sdk.io.aws2.kinesis.KinesisClientThrottledException
- KinesisIO - Class in org.apache.beam.sdk.io.aws2.kinesis
-
IO to read from Kinesis streams.
- KinesisIO() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- KinesisIO.Read - Class in org.apache.beam.sdk.io.aws2.kinesis
-
Implementation of KinesisIO.read().
- KinesisIO.RecordAggregation - Class in org.apache.beam.sdk.io.aws2.kinesis
-
Configuration of Kinesis record aggregation.
- KinesisIO.RecordAggregation.Builder - Class in org.apache.beam.sdk.io.aws2.kinesis
- KinesisIO.Write<T> - Class in org.apache.beam.sdk.io.aws2.kinesis
-
Implementation of KinesisIO.write().
- KinesisIO.Write.Result - Class in org.apache.beam.sdk.io.aws2.kinesis
-
Result of KinesisIO.write().
- KinesisIOOptions - Interface in org.apache.beam.sdk.io.aws2.kinesis
-
PipelineOptions for KinesisIO.
- KinesisIOOptions.KinesisIOOptionsRegistrar - Class in org.apache.beam.sdk.io.aws2.kinesis
-
A registrar containing the default KinesisIOOptions.
- KinesisIOOptions.MapFactory - Class in org.apache.beam.sdk.io.aws2.kinesis
- KinesisIOOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions.KinesisIOOptionsRegistrar
- KinesisPartitioner<T> - Interface in org.apache.beam.sdk.io.aws2.kinesis
-
Kinesis interface for custom partitioner.
- KinesisPartitioner.ExplicitPartitioner<T> - Interface in org.apache.beam.sdk.io.aws2.kinesis
-
An explicit partitioner that always returns a Nonnull explicit hash key.
- KinesisRecord - Class in org.apache.beam.sdk.io.aws2.kinesis
-
KinesisClientRecord enhanced with utility methods.
- KinesisRecord(ByteBuffer, String, long, String, Instant, Instant, String, String) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- KinesisRecord(KinesisClientRecord, String, String) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- KinesisTransformRegistrar - Class in org.apache.beam.sdk.io.aws2.kinesis
-
Exposes KinesisIO.Write and KinesisIO.Read as external transforms for cross-language usage.
- KinesisTransformRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar
- KinesisTransformRegistrar.KinesisReadToBytes - Class in org.apache.beam.sdk.io.aws2.kinesis
- KinesisTransformRegistrar.ReadDataBuilder - Class in org.apache.beam.sdk.io.aws2.kinesis
- KinesisTransformRegistrar.ReadDataBuilder.Configuration - Class in org.apache.beam.sdk.io.aws2.kinesis
- KinesisTransformRegistrar.WriteBuilder - Class in org.apache.beam.sdk.io.aws2.kinesis
- KinesisTransformRegistrar.WriteBuilder.Configuration - Class in org.apache.beam.sdk.io.aws2.kinesis
- knownBuilderInstances() - Method in interface org.apache.beam.sdk.expansion.ExternalTransformRegistrar
-
A mapping from URN to an ExternalTransformBuilder instance.
- knownBuilderInstances() - Method in class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar
- knownBuilderInstances() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar
- knownBuilderInstances() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
- knownBuilderInstances() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- knownBuilderInstances() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.SnowflakeTransformRegistrar
- knownBuilders() - Method in class org.apache.beam.io.debezium.DebeziumTransformRegistrar
- knownBuilders() - Method in interface org.apache.beam.sdk.expansion.ExternalTransformRegistrar
-
Deprecated. Prefer implementing 'knownBuilderInstances'. This method will be removed in a future version of Beam.
- knownBuilders() - Method in class org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar
- knownBuilders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead
- knownBuilders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite
- knownBuilders() - Method in class org.apache.beam.sdk.io.GenerateSequence.External
- knownBuilders() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External
- knownBuilders() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External
- knownTransforms() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionService.ExpansionServiceRegistrar
- knownTransforms() - Method in class org.apache.beam.sdk.expansion.service.ExpansionService.ExternalTransformRegistrarLoader
- knownUrns() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
- knownUrns() - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator
- knownUrns() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
- knownUrns() - Method in class org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator
- knownUrns() - Method in interface org.apache.beam.runners.spark.translation.SparkPortablePipelineTranslator
- knownUrns() - Method in class org.apache.beam.runners.spark.translation.SparkStreamingPortablePipelineTranslator
- KuduIO - Class in org.apache.beam.sdk.io.kudu
-
A bounded source and sink for Kudu.
- KuduIO.FormatFunction<T> - Interface in org.apache.beam.sdk.io.kudu
-
An interface used by the KuduIO Write to convert an input record into an Operation to apply as a mutation in Kudu.
- KuduIO.Read<T> - Class in org.apache.beam.sdk.io.kudu
-
Implementation of KuduIO.read().
- KuduIO.Write<T> - Class in org.apache.beam.sdk.io.kudu
-
A PTransform that writes to Kudu.
- kv(SerializableMatcher<? super K>, SerializableMatcher<? super V>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher that matches any KV with matching key and value.
- KV<K, V> - Class in org.apache.beam.sdk.values
-
An immutable key/value pair.
- KV.OrderByKey<K, V> - Class in org.apache.beam.sdk.values
-
A Comparator that orders KVs by the natural ordering of their keys.
- KV.OrderByValue<K, V> - Class in org.apache.beam.sdk.values
-
A Comparator that orders KVs by the natural ordering of their values.
- KvCoder<K, V> - Class in org.apache.beam.sdk.coders
-
A KvCoder encodes KVs.
- kvEncoder(Encoder<K>, Encoder<V>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
- kvEncoderOf(KvCoder<K, V>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
- KvKeySelector<InputT, K> - Class in org.apache.beam.runners.flink.translation.types
- KvKeySelector(Coder<K>) - Constructor for class org.apache.beam.runners.flink.translation.types.KvKeySelector
- kvs() - Static method in class org.apache.beam.sdk.transforms.ToString
-
Transforms each element of the input PCollection to a String by using the Object.toString() on the key, followed by a ",", followed by the Object.toString() of the value.
- kvs(String) - Static method in class org.apache.beam.sdk.transforms.ToString
-
Transforms each element of the input PCollection to a String by using the Object.toString() on the key, followed by the specified delimiter, followed by the Object.toString() of the value.
- kvs(TypeDescriptor<K>, TypeDescriptor<V>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The TypeDescriptor for KV.
- KvSwap<K, V> - Class in org.apache.beam.sdk.transforms
-
KvSwap<K, V> takes a PCollection<KV<K, V>> and returns a PCollection<KV<V, K>>, where all the keys and values have been swapped.
- KvToFlinkKeyKeySelector<K, V> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
-
KeySelector that retrieves a key from a KV.
- KvToFlinkKeyKeySelector(Coder<K>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.KvToFlinkKeyKeySelector
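A sketch tying together the KV, KvSwap, and ToString.kvs() entries above (the pipeline variable is illustrative):

    PCollection<KV<String, Integer>> counts =
        pipeline.apply(Create.of(KV.of("a", 1), KV.of("b", 2)));
    PCollection<KV<Integer, String>> swapped = counts.apply(KvSwap.create());
    PCollection<String> formatted = counts.apply(ToString.kvs()); // "a,1", "b,2"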
- kvWithKey(K) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher that matches any KV with the specified key.
- kvWithKey(Coder<K>, K) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher that matches any KV with the specified key.
- kvWithKey(SerializableMatcher<? super K>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher that matches any KV with matching key.
- kvWithValue(Coder<V>, V) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher that matches any KV with the specified value.
- kvWithValue(SerializableMatcher<? super V>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher that matches any KV with matching value.
- kvWithValue(V) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher that matches any KV with the specified value.
L
- LabeledMetricNameUtils - Class in org.apache.beam.sdk.metrics
-
Util class for building/parsing labeled MetricName.
- LabeledMetricNameUtils() - Constructor for class org.apache.beam.sdk.metrics.LabeledMetricNameUtils
- LabeledMetricNameUtils.MetricNameBuilder - Class in org.apache.beam.sdk.metrics
-
Builder class for a labeled MetricName.
- LabeledMetricNameUtils.ParsedMetricName - Class in org.apache.beam.sdk.metrics
- LABELS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- languageHint() - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText
- LargeKeys - Interface in org.apache.beam.sdk.testing
-
Category tags for tests which validate that a Beam runner can handle keys up to a given size.
- LargeKeys.Above100KB - Interface in org.apache.beam.sdk.testing
-
Tests if a runner supports 100KB keys.
- LargeKeys.Above100MB - Interface in org.apache.beam.sdk.testing
-
Tests if a runner supports 100MB keys.
- LargeKeys.Above10KB - Interface in org.apache.beam.sdk.testing
-
Tests if a runner supports 10KB keys.
- LargeKeys.Above10MB - Interface in org.apache.beam.sdk.testing
-
Tests if a runner supports 10MB keys.
- LargeKeys.Above1MB - Interface in org.apache.beam.sdk.testing
-
Tests if a runner supports 1MB keys.
- largest(int) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<List<T>> with a single element containing the largest count elements of the input PCollection<T>, in decreasing order, sorted according to their natural order.
- Largest() - Constructor for class org.apache.beam.sdk.transforms.Top.Largest
-
Deprecated.
- largestContinuousRange() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
- largestDoublesFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a Top.TopCombineFn that aggregates the largest count double values.
- largestFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a Top.TopCombineFn that aggregates the largest count values.
- largestIntsFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a Top.TopCombineFn that aggregates the largest count int values.
- largestLongsFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a Top.TopCombineFn that aggregates the largest count long values.
- largestPerKey(int) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a PCollection<KV<K, List<V>>> that contains an output element mapping each distinct key in the input PCollection to the largest count values associated with that key in the input PCollection<KV<K, V>>, in decreasing order, sorted according to their natural order.
- largestRange(Iterable<ContiguousSequenceRange>) - Static method in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
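A sketch of the Top entries above (the pipeline and scores variables are illustrative):

    PCollection<Integer> values = pipeline.apply(Create.of(3, 1, 4, 1, 5, 9));
    PCollection<List<Integer>> top3 = values.apply(Top.<Integer>largest(3)); // [[9, 5, 4]]
    // scores is a PCollection<KV<String, Integer>>; keeps the two largest values per key.
    PCollection<KV<String, List<Integer>>> topPerKey =
        scores.apply(Top.<String, Integer>largestPerKey(2));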
- LargestUnique(long) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
-
Deprecated. Creates a heap to track the largest sampleSize elements.
- last - Variable in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
- LAST - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- lastAttemptedOffset - Variable in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
- lastAttemptedPosition - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- lastClaimedOffset - Variable in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
- lastClaimedPosition - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- lastModifiedMillis() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
-
Last modification timestamp in milliseconds since Unix epoch.
- LATE - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.PaneInfo.Timing
-
Pane was fired after the output watermark had progressed past the end of the window.
- LatencyRecordingHttpRequestInitializer - Class in org.apache.beam.sdk.extensions.gcp.util
-
HttpRequestInitializer for recording request to response latency of Http-based API calls.
- LatencyRecordingHttpRequestInitializer(Map<String, String>) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.LatencyRecordingHttpRequestInitializer
- LatencyRecordingHttpRequestInitializer(Histogram) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.LatencyRecordingHttpRequestInitializer
- Latest - Class in org.apache.beam.sdk.transforms
- LATEST - Enum constant in enum class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows.StartingStrategy
- LATEST - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
The policy of taking the latest of a set of timestamps.
- latestContiguousRange() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
- LazyAggregateCombineFn<InputT, AccumT, OutputT> - Class in org.apache.beam.sdk.extensions.sql.impl
-
Combine.CombineFn that wraps an AggregateFn.
- LazyAggregateCombineFn(List<String>, String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- LazyFlinkSourceSplitEnumerator<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io.source
- LazyFlinkSourceSplitEnumerator(SplitEnumeratorContext<FlinkSourceSplit<T>>, Source<T>, PipelineOptions, int, boolean) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.LazyFlinkSourceSplitEnumerator
- leaseWorkItem(String, LeaseWorkItemRequest) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
Leases the work item for jobId.
- LEAST_SIGNIFICANT_BITS_FIELD_NAME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
- leaveCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.translation.GroupByKeyVisitor
- leaveCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
- leaveCompositeTransform(TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
-
Called for each composite transform after all of its component transforms and their outputs have been visited.
- leavePipeline(Pipeline) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
- leavePipeline(Pipeline) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
-
Called when all values and transforms in a Pipeline have been visited.
- left(Integer...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual.Impl
- left(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
- left(String...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual.Impl
- left(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
- left(FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual.Impl
- left(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
- LEFT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinPushThroughJoinRule
-
Instance of the rule that works on logical joins only, and pushes to the left.
- leftOuterBroadcastJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
-
Perform a left outer join, broadcasting the right side.
- leftOuterJoin(String, PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
-
Left Outer Join of two collections of KV elements.
- leftOuterJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
- leftOuterJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
-
Perform a left outer join.
- length - Variable in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
- lengthBytes(byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- LengthPrefixCoder<T> - Class in org.apache.beam.sdk.coders
-
A Coder which is able to take any existing coder and wrap it such that it is only invoked in the outer context.
- LengthPrefixUnknownCoders - Class in org.apache.beam.runners.fnexecution.wire
-
Utilities for replacing or wrapping unknown coders with LengthPrefixCoder.
- LengthPrefixUnknownCoders() - Constructor for class org.apache.beam.runners.fnexecution.wire.LengthPrefixUnknownCoders
- lengthString(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- LESS_THAN - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- LESS_THAN_OR_EQUAL - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- lessThan(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.lessThan(Comparable).
- lessThan(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.lessThan(Comparable).
- lessThan(T) - Static method in class org.apache.beam.sdk.transforms.Filter
-
Returns a PTransform that takes an input PCollection and returns a PCollection with elements that are less than a given value, based on the elements' natural ordering.
- lessThanEq(T) - Static method in class org.apache.beam.sdk.transforms.Filter
-
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are less than or equal to a given value, based on the elements' natural ordering.
- lessThanOrEqualTo(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.lessThanOrEqualTo(Comparable).
- lessThanOrEqualTo(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.lessThanOrEqualTo(Comparable).
- LEVEL_CONFIGURATION - Static variable in enum class org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
-
Map from LogLevel enums to java logging level.
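A sketch of the Filter.lessThan and Filter.lessThanEq entries above (the pipeline variable is illustrative):

    PCollection<Integer> numbers = pipeline.apply(Create.of(1, 5, 10, 15));
    PCollection<Integer> belowTen = numbers.apply(Filter.lessThan(10));    // 1, 5
    PCollection<Integer> atMostTen = numbers.apply(Filter.lessThanEq(10)); // 1, 5, 10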
- LHS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.Join
- like(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- LIKE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- LIKE_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
- limit(Iterable<T>, int) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables
-
Limits the PrefetchableIterable to the specified number of elements.
- LimitNumberOfFiles(int) - Constructor for class org.apache.beam.sdk.io.TextRowCountEstimator.LimitNumberOfFiles
- LimitNumberOfTotalBytes(long) - Constructor for class org.apache.beam.sdk.io.TextRowCountEstimator.LimitNumberOfTotalBytes
- Lineage - Class in org.apache.beam.sdk.metrics
-
Standard collection of metrics used to record source and sink information for lineage tracking.
- LINEAGE_NAMESPACE - Static variable in class org.apache.beam.sdk.metrics.Lineage
- Lineage.Type - Enum Class in org.apache.beam.sdk.metrics
-
Lineage metrics resource types.
- LineReadSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
-
A FileReadSchemaTransformFormatProvider that reads lines as Strings.
- LineReadSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.LineReadSchemaTransformFormatProvider
- LinesReadConverter() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.LinesReadConverter
- LinesWriteConverter() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.LinesWriteConverter
- LIST - Enum constant in enum class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier.Kind
- LIST_PARTITIONS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Counter for the total number of partitions identified during the execution of the Connector.
- listAllFhirStores(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
List all FHIR stores in a dataset.
- listAllFhirStores(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- ListCoder<T> - Class in org.apache.beam.sdk.coders
- ListCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.ListCoder
- listCollectionIds() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
-
Factory method to create a new type safe builder for ListCollectionIdsRequest operations.
- listDatabases() - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
-
Returns a set of existing databases accessible to this catalog.
- listDatabases() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- listDatabases() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergCatalog
- listDocuments() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
-
Factory method to create a new type safe builder for ListDocumentsRequest operations.
- listJobMessages(String, String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
Lists job messages with the given jobId.
- listJobs(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
Lists Dataflow Jobs in the project associated with the DataflowPipelineOptions.
- listNamespaces() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- listObjects(String, String, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
- listObjects(String, String, String, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
- listOf(T) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
- lists(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The TypeDescriptor for List.
- listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return a list of subscriptions for topic in project.
- listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- listTables(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return a list of topics for project.
- listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- listView(PCollection<T>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a PCollectionView<List<T>> capable of processing elements windowed using the provided WindowingStrategy.
- listView(PCollection<T>, TupleTag<Materializations.IterableView<T>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a PCollectionView<List<T>> capable of processing elements windowed using the provided WindowingStrategy.
- ListViewFn(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- ListViewFn2(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.ListViewFn2
- listViewUsingVoidKey(PCollection<KV<Void, T>>, TupleTag<Materializations.MultimapView<Void, T>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
- listViewUsingVoidKey(TupleTag<Materializations.MultimapView<Void, T>>, PCollection<KV<Void, T>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
- listViewWithRandomAccess(PCollection<KV<Long, PCollectionViews.ValueOrMetadata<T, OffsetRange>>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a PCollectionView<List<T>> capable of processing elements windowed using the provided WindowingStrategy.
- listViewWithRandomAccess(PCollection<KV<Long, PCollectionViews.ValueOrMetadata<T, OffsetRange>>>, TupleTag<Materializations.MultimapView<Long, PCollectionViews.ValueOrMetadata<T, OffsetRange>>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a PCollectionView<List<T>> capable of processing elements windowed using the provided WindowingStrategy.
- loadAggregateFunction(List<String>, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.JavaUdfLoader
-
Load a user-defined aggregate function from the specified jar.
- loader(PCollection<T>) - Static method in interface org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues
-
Factory to load SideInputValues from a Dataset based on the window strategy.
- loadFromStream(InputStream) - Static method in class org.apache.beam.sdk.schemas.utils.YamlUtils
- Loading historical data into time-partitioned BigQuery tables - Search tag in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
- Section
- loadPluginClass(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- loadProviders() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformFormatProviders
-
Load all FileWriteSchemaTransformFormatProvider implementations.
- loadProviders(Class<T>) - Static method in class org.apache.beam.sdk.schemas.io.Providers
- loadScalarFunction(List<String>, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.JavaUdfLoader
-
Load a user-defined scalar function from the specified jar.
- LocalFileSystemRegistrar - Class in org.apache.beam.sdk.io
-
AutoService registrar for the LocalFileSystem.
- LocalFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.LocalFileSystemRegistrar
- localhost - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
- localhost - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
- localhost - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
- LocalMktDate() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.LocalMktDate
- LocalResources - Class in org.apache.beam.sdk.io
-
Helper functions for producing a
ResourceId
that references a local file or directory. - LocalTimestampMicrosConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
- LocalTimestampMillisConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
- LocalWindmillHostportFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.LocalWindmillHostportFactory
- location - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- location - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- location(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
- lockAndRecordPartition(PartitionRecord) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Lock the partition in the metadata table for the DoFn streaming it.
- locked(Lock) - Static method in class org.apache.beam.runners.flink.translation.utils.Locker
- Locker - Class in org.apache.beam.runners.flink.translation.utils
- log(BeamFnApi.LogEntry) - Method in interface org.apache.beam.runners.fnexecution.logging.LogWriter
-
Write the contents of the Log Entry to some logging backend.
- log(BeamFnApi.LogEntry) - Method in class org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter
- LOG_APPEND_TIME - Enum constant in enum class org.apache.beam.sdk.io.kafka.KafkaTimestampType
- LogAppendTimePolicy(Optional<Instant>) - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
- logging(StreamObserver<BeamFnApi.LogControl>) - Method in class org.apache.beam.runners.fnexecution.logging.GrpcLoggingService
- LoggingHandler() - Constructor for class org.apache.beam.runners.dataflow.util.MonitoringUtil.LoggingHandler
- LoggingTransformProvider - Class in org.apache.beam.sdk.schemas.transforms.providers
-
An implementation of
TypedSchemaTransformProvider
for Logging. - LoggingTransformProvider() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
- LoggingTransformProvider.Configuration - Class in org.apache.beam.sdk.schemas.transforms.providers
- LoggingTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
- LoggingTransformProvider.LoggingTransform - Class in org.apache.beam.sdk.schemas.transforms.providers
-
A
SchemaTransform
for logging. - LOGICAL_TYPE - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- LogicalEndpoint - Class in org.apache.beam.sdk.fn.data
-
A logical endpoint is a pair of an instruction ID corresponding to the
BeamFnApi.ProcessBundleRequest
and the transform within the processing graph. - LogicalEndpoint() - Constructor for class org.apache.beam.sdk.fn.data.LogicalEndpoint
- logicalType(Schema.LogicalType<InputT, BaseT>) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Creates a logical type based on a primitive field type.
- logInfo(Function0<String>) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
- LogWriter - Interface in org.apache.beam.runners.fnexecution.logging
-
A consumer of
Beam Log Entries
. - LONG_BYTES - Static variable in class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- longs() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The
TypeDescriptor
for Long. - Longs() - Constructor for class org.apache.beam.sdk.jmh.util.VarIntBenchmark.Longs
- longsGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a
PTransform
that takes an inputPCollection<Long>
and returns aPCollection<Long>
whose contents is the maximum of the inputPCollection
's elements, orLong.MIN_VALUE
if there are no elements. - longsGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a
PTransform
that takes an inputPCollection<Long>
and returns aPCollection<Long>
whose contents is the minimum of the inputPCollection
's elements, orLong.MAX_VALUE
if there are no elements. - longsGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
-
Returns a
PTransform
that takes an inputPCollection<Long>
and returns aPCollection<Long>
whose contents is the sum of the inputPCollection
's elements, or0
if there are no elements. - longsPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a
PTransform
that takes an inputPCollection<KV<K, Long>>
and returns aPCollection<KV<K, Long>>
that contains an output element mapping each distinct key in the inputPCollection
to the maximum of the values associated with that key in the inputPCollection
. - longsPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a
PTransform
that takes an inputPCollection<KV<K, Long>>
and returns aPCollection<KV<K, Long>>
that contains an output element mapping each distinct key in the inputPCollection
to the minimum of the values associated with that key in the inputPCollection
. - longsPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
-
Returns a
PTransform
that takes an inputPCollection<KV<K, Long>>
and returns aPCollection<KV<K, Long>>
that contains an output element mapping each distinct key in the inputPCollection
to the sum of the values associated with that key in the inputPCollection
. - longToByteArray(long) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
- LookupPipelineVisitor - Class in org.apache.beam.runners.flink.translation.utils
-
Pipeline visitor that fills lookup table of
PTransform
toAppliedPTransform
for usage inFlinkBatchPortablePipelineTranslator.BatchTranslationContext
. - LookupPipelineVisitor() - Constructor for class org.apache.beam.runners.flink.translation.utils.LookupPipelineVisitor
- LossyTimeMicrosConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.LossyTimeMicrosConversion
- LossyTimestampMicrosConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.LossyTimestampMicrosConversion
- LOWER_LATENCY - Enum constant in enum class org.apache.beam.sdk.io.solace.SolaceIO.SubmissionMode
- lpad(byte[], Long) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- lpad(byte[], Long, byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- lpad(String, Long) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- lpad(String, Long, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- LPUSH - Enum constant in enum class org.apache.beam.sdk.io.redis.RedisIO.Write.Method
-
Use LPUSH command.
- ltrim(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- ltrim(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- LTRIM - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- LTRIM_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
- LZO - Enum constant in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- LZO - Enum constant in enum class org.apache.beam.sdk.io.Compression
-
LZO compression using LZO codec.
- LZO - Enum constant in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- LZOP - Enum constant in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- LZOP - Enum constant in enum class org.apache.beam.sdk.io.Compression
-
LZOP compression using LZOP codec.
- LZOP - Enum constant in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
M
- main(String[]) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
- main(String[]) - Static method in class org.apache.beam.runners.flink.FlinkMiniClusterEntryPoint
- main(String[]) - Static method in class org.apache.beam.runners.flink.FlinkPipelineRunner
-
Main method to be called only as the entry point to an executable jar with structure as defined in
PortablePipelineJarUtils
. - main(String[]) - Static method in class org.apache.beam.runners.flink.FlinkPortableClientEntryPoint
-
Main method to be called standalone or by Flink (CLI or REST API).
- main(String[]) - Static method in class org.apache.beam.runners.spark.SparkJobServerDriver
- main(String[]) - Static method in class org.apache.beam.runners.spark.SparkPipelineRunner
-
Main method to be called only as the entry point to an executable jar with structure as defined in
PortablePipelineJarUtils
. - main(String[]) - Static method in class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount
- main(String[]) - Static method in class org.apache.beam.sdk.expansion.service.ExpansionService
- main(String[]) - Static method in class org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample
- main(String[]) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon
-
Use this to create the index for reading before IT read tests.
- main(String[]) - Static method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
- mainOutputTag - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- makeCost(double, double) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
- makeCost(double, double, double) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
- makeHL7v2ListRequest(String, String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Make hl 7 v 2 list request list messages response.
- makeHL7v2ListRequest(String, String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- makeHugeCost() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
- makeInfiniteCost() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
- makeListRequest(HealthcareApiClient, String, Instant, Instant, String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
-
Make list request list messages response.
- makeOrderKeysFromCollation(RelCollation) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
Transform a list of keys in Calcite to
ORDER BY
toOrderKey
s. - makeOutput(FileIO.ReadableFile, OffsetRange, FileBasedSource<InT>, BoundedSource.BoundedReader<InT>) - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform.AbstractReadFileRangesFn
- makeProgress() - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles.BlockTracker
- makeRel(RelOptCluster, RelTraitSet, RelBuilder, RelNode, RexProgram) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
- makeSendTimeBoundHL7v2ListRequest(String, Instant, Instant, String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Time Bound HL7v2 list request.
- makeSendTimeBoundHL7v2ListRequest(String, Instant, Instant, String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- makeTinyCost() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
- makeZeroCost() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
- Managed - Class in org.apache.beam.sdk.managed
-
Top-level
PTransform
s that build and instantiate turnkey transforms. - Managed() - Constructor for class org.apache.beam.sdk.managed.Managed
- Managed.ManagedTransform - Class in org.apache.beam.sdk.managed
- ManagedChannelFactory - Class in org.apache.beam.sdk.fn.channel
-
A Factory which creates
ManagedChannel
instances. - ManagedFactory<T> - Interface in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
A ManagedFactory produces instances and tears down any produced instances when it is itself closed.
- ManagedFactoryImpl<T> - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- ManagedSchemaTransformProvider - Class in org.apache.beam.sdk.managed
- ManagedSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.managed.ManagedSchemaTransformProvider
- ManagedSchemaTransformTranslation - Class in org.apache.beam.sdk.managed
- ManagedSchemaTransformTranslation() - Constructor for class org.apache.beam.sdk.managed.ManagedSchemaTransformTranslation
- ManagedSchemaTransformTranslation.ManagedTransformRegistrar - Class in org.apache.beam.sdk.managed
- ManagedTransform() - Constructor for class org.apache.beam.sdk.managed.Managed.ManagedTransform
- ManagedTransformRegistrar() - Constructor for class org.apache.beam.sdk.managed.ManagedSchemaTransformTranslation.ManagedTransformRegistrar
- Manual(Instant) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.Manual
- ManualDockerEnvironmentOptions - Interface in org.apache.beam.sdk.options
-
Pipeline options to tune DockerEnvironment.
- ManualDockerEnvironmentOptions.Options - Class in org.apache.beam.sdk.options
-
Register the
ManualDockerEnvironmentOptions
. - ManualWatermarkEstimator<WatermarkEstimatorStateT> - Interface in org.apache.beam.sdk.transforms.splittabledofn
-
A
WatermarkEstimator
which is controlled manually from within aDoFn
. - map() - Static method in class org.apache.beam.sdk.state.StateSpecs
- map(byte[]) - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToElemFunction
- map(ResultSet) - Method in interface org.apache.beam.sdk.io.cassandra.Mapper
-
This method is called when reading data from Cassandra.
- map(BytesXMLMessage) - Static method in class org.apache.beam.sdk.io.solace.data.Solace.SolaceRecordMapper
-
Maps a
BytesXMLMessage
(if not null) to aSolace.Record
. - map(Tuple<byte[], byte[]>) - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunctionPrimitive
- map(Tuple<byte[], Iterator<byte[]>>) - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunction
- map(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Identical to
StateSpecs.map()
, but with key and value coders explicitly supplied. - map(Schema.FieldType, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Create a map type for the given key and value types.
- map(Schema.FieldType, Schema.FieldType, boolean) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Deprecated.Set the nullability on the valueType instead
- map(WindowedValue<KV<K, V>>) - Method in class org.apache.beam.runners.twister2.translators.functions.MapToTupleFunction
- map(WindowedValue<V>) - Method in class org.apache.beam.runners.twister2.translators.functions.ElemToBytesFunction
- map(T) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkIdentityFunction
- MAP - Enum constant in enum class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier.Kind
- MAP - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- MAP_TYPES - Static variable in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- MapCoder<K,
V> - Class in org.apache.beam.sdk.coders - MapControlClientPool - Class in org.apache.beam.runners.fnexecution.control
-
A
ControlClientPool
backed by a client map. - MapCsvToStringArrayFn(ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read.MapCsvToStringArrayFn
- MapElements<InputT,
OutputT> - Class in org.apache.beam.sdk.transforms -
PTransform
s for mapping a simple function over the elements of aPCollection
. - MapElements.MapWithFailures<InputT,
OutputT, - Class in org.apache.beam.sdk.transformsFailureT> -
A
PTransform
that adds exception handling toMapElements
. - mapEncoder(Encoder<K>, Encoder<V>, Class<MapT>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
- MapFactory() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions.MapFactory
- MapKeys<K1,
K2, - Class in org.apache.beam.sdk.transformsV> -
MapKeys
maps aSerializableFunction<K1,K2>
over keys of aPCollection<KV<K1,V>>
and returns aPCollection<KV<K2, V>>
. - mapMessage(Message) - Method in interface org.apache.beam.sdk.io.jms.JmsIO.MessageMapper
- MapOfIntBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfIntBundle
- MapOfNestedIntBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfNestedIntBundle
- mapOutputs(Map<TupleTag<?>, PCollection<?>>, PCollection<ElemT>) - Method in class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView.Factory
- mapOutputs(Map<TupleTag<?>, PCollection<?>>, PCollectionTuple) - Method in class org.apache.beam.runners.direct.ParDoMultiOverrideFactory
- mapPartition(Iterable<WindowedValue<InputT>>, Collector<RawUnionValue>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction
-
For non-stateful processing via a simple MapPartitionFunction.
- Mapper<T> - Interface in org.apache.beam.sdk.io.cassandra
-
This interface allows you to implement a custom mapper to read and persist elements from/to Cassandra.
- MapperFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
-
Factory class for creating instances that will map a struct to a connector model.
- MapperFactory(Dialect) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.MapperFactory
- Mapping between Beam and ClickHouse types - Search tag in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
- Section
- MappingUtils - Class in org.apache.beam.sdk.io.cdap
-
Util class for mapping plugins.
- MappingUtils() - Constructor for class org.apache.beam.sdk.io.cdap.MappingUtils
- mapQualifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- mapQualifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
- MapQualifierContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
- MapQualifierListContext(FieldSpecifierNotationParser.QualifierListContext) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
- mapRow(String[]) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakeIO.CsvMapper
- mapRow(ResultSet) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.RowMapper
- mapRow(ResultSet) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcReadWithPartitionsHelper
- mapRow(ResultSet) - Method in class org.apache.beam.sdk.io.jdbc.SchemaUtil.BeamRowMapper
- mapRow(ResultSet) - Method in interface org.apache.beam.sdk.io.singlestore.SingleStoreIO.RowMapper
- mapRow(Record) - Method in interface org.apache.beam.sdk.io.neo4j.Neo4jIO.RowMapper
- mapRow(T) - Method in interface org.apache.beam.sdk.io.singlestore.SingleStoreIO.UserDataMapper
- mapRow(T) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakeIO.UserDataMapper
- maps(TypeDescriptor<K>, TypeDescriptor<V>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The
TypeDescriptor
forMap
. - mapSourceFunction(SerializablePipelineOptions, String) - Static method in class org.apache.beam.runners.spark.stateful.StateSpecFunctions
-
A
StateSpec
function to support reading from anUnboundedSource
. - mapSourceRecord(SourceRecord) - Method in class org.apache.beam.io.debezium.SourceRecordJson.SourceRecordJsonMapper
- mapSourceRecord(SourceRecord) - Method in interface org.apache.beam.io.debezium.SourceRecordMapper
- MapState<K,
V> - Interface in org.apache.beam.sdk.state -
A
ReadableState
cell mapping keys to values. - mapToRequest(ByteString, ImageContext) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytes
-
Maps the
ByteString
with encoded image data and the optionalImageContext
into anAnnotateImageRequest
. - mapToRequest(String, ImageContext) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUri
- mapToRequest(KV<ByteString, ImageContext>, ImageContext) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytesWithContext
- mapToRequest(KV<String, ImageContext>, ImageContext) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUriWithContext
- MapToTupleFunction<K,
V> - Class in org.apache.beam.runners.twister2.translators.functions -
Map to tuple function.
- MapToTupleFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.MapToTupleFunction
- MapToTupleFunction(Coder<K>, WindowedValues.WindowedValueCoder<V>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.MapToTupleFunction
- MapValues<K,
V1, - Class in org.apache.beam.sdk.transformsV2> -
MapValues
maps aSerializableFunction<V1,V2>
over values of aPCollection<KV<K,V1>>
and returns aPCollection<KV<K, V2>>
. - mapView(PCollection<KV<K, V>>, PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a
PCollectionView<Map<K, V>>
capable of processing elements windowed using the providedWindowingStrategy
. - MapViewFn(PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
-
Deprecated.
- MapViewFn2(PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.MapViewFn2
- mapViewUsingVoidKey(TupleTag<Materializations.MultimapView<Void, KV<K, V>>>, PCollection<KV<Void, KV<K, V>>>, PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
- markDone() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
-
Marks this range tracker as being done.
- markDone() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
Marks this range tracker as being done.
- markNewPartitionForDeletion(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
This is the 1st step of 2 phase delete.
- match() - Static method in class org.apache.beam.sdk.io.FileIO
-
Matches a filepattern using
FileSystems.match(java.util.List<java.lang.String>)
and produces a collection of matched resources (both files and directories) asMatchResult.Metadata
. - match(Class<V>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
- match(String) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Like
FileSystems.match(List)
, but for a single resource specification. - match(String, EmptyMatchTreatment) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Like
FileSystems.match(String)
, but with a configurableEmptyMatchTreatment
. - match(List<String>) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
- match(List<String>) - Method in class org.apache.beam.sdk.io.FileSystem
-
This is the entry point to convert user-provided specs to
ResourceIds
. - match(List<String>) - Static method in class org.apache.beam.sdk.io.FileSystems
-
This is the entry point to convert user-provided specs to
ResourceIds
. - match(List<String>, EmptyMatchTreatment) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Like
FileSystems.match(List)
, but with a configurableEmptyMatchTreatment
. - match(StateSpec.Cases<ResultT>) - Method in interface org.apache.beam.sdk.state.StateSpec
-
For internal use only; no backwards-compatibility guarantees.
- Match() - Constructor for class org.apache.beam.sdk.io.FileIO.Match
- matchAll() - Static method in class org.apache.beam.sdk.io.FileIO
-
Like
FileIO.match()
, but matches each filepattern in a collection of filepatterns. - MatchAll() - Constructor for class org.apache.beam.sdk.io.FileIO.MatchAll
- MatchConfiguration() - Constructor for class org.apache.beam.sdk.io.FileIO.MatchConfiguration
- MatcherAndError() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.MatcherAndError
- MatcherCheckerFn(SerializableMatcher<T>) - Constructor for class org.apache.beam.sdk.testing.PAssert.MatcherCheckerFn
- matches(Object) - Method in class org.apache.beam.sdk.testing.RegexMatcher
- matches(Object) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
- matches(String) - Method in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.Returns
true
if the given file name implies that the contents are compressed according to the compression embodied by this factory. - matches(String) - Method in enum class org.apache.beam.sdk.io.Compression
- matches(String) - Method in enum class org.apache.beam.sdk.io.TextIO.CompressionType
-
Deprecated.
- matches(String) - Method in enum class org.apache.beam.sdk.io.TFRecordIO.CompressionType
-
Deprecated.
- matches(String) - Method in enum class org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
-
Deprecated.
- matches(String) - Static method in class org.apache.beam.sdk.testing.RegexMatcher
- matches(String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.Matches
PTransform
that checks if the entire line matches the Regex. - matches(String, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.Matches
PTransform
that checks if the entire line matches the Regex. - matches(String, String) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.MatchFn
- matches(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.MatchesName
PTransform
that checks if the entire line matches the Regex. - matches(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.Matches
PTransform
that checks if the entire line matches the Regex. - matches(Pattern, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.Matches
PTransform
that checks if the entire line matches the Regex. - matches(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.MatchesName
PTransform
that checks if the entire line matches the Regex. - matches(MetricsFilter, MetricKey) - Static method in class org.apache.beam.sdk.metrics.MetricFiltering
-
Matching logic is implemented here rather than in MetricsFilter because we would like MetricsFilter to act as a "dumb" value-object, with the possibility of replacing it with a Proto/JSON/etc.
- matches(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcRule
- matches(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcSplittingRule
- matches(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCoGBKJoinRule
- matches(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputJoinRule
- matches(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputLookupJoinRule
- Matches(Pattern, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.Matches
- matchesKV(String, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.MatchesKV
PTransform
that checks if the entire line matches the Regex. - matchesKV(String, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.MatchesNameKV
PTransform
that checks if the entire line matches the Regex. - matchesKV(Pattern, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.MatchesKV
PTransform
that checks if the entire line matches the Regex. - matchesKV(Pattern, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.MatchesNameKV
PTransform
that checks if the entire line matches the Regex. - MatchesKV(Pattern, int, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.MatchesKV
- MatchesName(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.MatchesName
- MatchesNameKV(Pattern, String, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.MatchesNameKV
- matchesSafely(BigqueryMatcher.TableAndQuery) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- matchesSafely(ShardedFile) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
- matchesSafely(T) - Method in class org.apache.beam.sdk.testing.JsonMatcher
- matchesScope(String, Set<String>) - Static method in class org.apache.beam.sdk.metrics.MetricFiltering
-
matchesScope(actualScope, scopes)
returns true if the scope of a metric is matched by any of the filters inscopes
. - MatchFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.MatchFn
- Matching filepatterns - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- matchNewDirectory(String, String...) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Returns a new
ResourceId
that represents the named directory resource. - matchNewResource(String, boolean) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
- matchNewResource(String, boolean) - Method in class org.apache.beam.sdk.io.FileSystem
-
Returns a new
ResourceId
for this filesystem that represents the named resource. - matchNewResource(String, boolean) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Returns a new
ResourceId
that represents the named resource of a type corresponding to the resource type. - matchResources(List<ResourceId>) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Returns
MatchResults
for the givenresourceIds
. - MatchResult - Class in org.apache.beam.sdk.io.fs
-
The result of
FileSystem.match(java.util.List<java.lang.String>)
. - MatchResult.Metadata - Class in org.apache.beam.sdk.io.fs
-
MatchResult.Metadata
of a matched file. - MatchResult.Metadata.Builder - Class in org.apache.beam.sdk.io.fs
-
Builder class for
MatchResult.Metadata
. - MatchResult.Status - Enum Class in org.apache.beam.sdk.io.fs
-
Status of a
MatchResult
. - matchSingleFileSpec(String) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Returns the
MatchResult.Metadata
for a single file resource. - Materialization<T> - Interface in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- Materializations - Class in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- Materializations() - Constructor for class org.apache.beam.sdk.transforms.Materializations
- Materializations.IterableView<V> - Interface in org.apache.beam.sdk.transforms
-
Represents the
PrimitiveViewT
supplied to theViewFn
when it declares to use theiterable materialization
. - Materializations.MultimapView<K,
V> - Interface in org.apache.beam.sdk.transforms -
Represents the
PrimitiveViewT
supplied to theViewFn
when it declares to use themultimap materialization
. - MATERIALIZED - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.DefaultType
- materializedOrAlias() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
- Max - Class in org.apache.beam.sdk.transforms
-
PTransform
s for computing the maximum of the elements in aPCollection
, or the maximum of the values associated with each key in aPCollection
ofKV
s. - MAX_HASH_KEY - Static variable in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
- MAX_INCLUSIVE_END_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
Represents the max end at that can be specified for a change stream.
- MAX_LENGTH - Static variable in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
- MAX_SIZE - Static variable in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
- MAX_UNIX_MILLIS - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- maxBackoff() - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
- maxBackoff(Duration) - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
- maxBufferedTime(Duration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation.Builder
-
Buffer timeout for user records.
- maxBufferingDuration() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
- maxBufferingDuration() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- MaxBundleSizeFactory() - Constructor for class org.apache.beam.runners.flink.FlinkPipelineOptions.MaxBundleSizeFactory
- MaxBundleTimeFactory() - Constructor for class org.apache.beam.runners.flink.FlinkPipelineOptions.MaxBundleTimeFactory
- maxBytes(int) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation.Builder
-
Max bytes per aggregated record.
- maxConnections() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
The maximum number of connections allowed in the connection pool.
- maxConnections(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
The maximum number of connections allowed in the connection pool.
- maxCumulativeBackoff() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
- MAXIMUM_PRECISION - Static variable in class org.apache.beam.sdk.extensions.zetasketch.HllCount
-
The maximum
precision
value you can set inHllCount.Init.Builder.withPrecision(int)
is 24. - maximumLookback() - Method in class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
-
The maximum distance between the end of any main input window
mainWindow
and the end of the side input window returned byWindowMappingFn.getSideInputWindow(BoundedWindow)
- maxInsertBlockSize() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
- maxNumRecords() - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration
- maxRetries() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
- MaxStackTraceDepthToReportFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.MaxStackTraceDepthToReportFactory
- maxTimestamp() - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
- maxTimestamp() - Method in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
-
Returns the inclusive upper bound of timestamps for values in this window.
- maxTimestamp() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
- maxTimestamp() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns the largest timestamp that can be included in this window.
- maxTimestamp(Iterable<BoundedWindow>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
-
The end of the only window (max timestamp).
- maybeApplyFilter(CloseableIterable<Record>, IcebergScanConfig) - Static method in class org.apache.beam.sdk.io.iceberg.ReadUtils
- mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
- mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
- mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
- mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
- mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
- mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
- mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
For internal use only; no backwards-compatibility guarantees.
- mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger.OnceTrigger
- md5(String) - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey.Builder
- md5Bytes(byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
-
MD5(X)
- md5String(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
-
MD5(X)
- Mean - Class in org.apache.beam.sdk.transforms
-
PTransform
s for computing the arithmetic mean (a.k.a. - MemoryMonitorOptions - Interface in org.apache.beam.sdk.options
-
Options that are used to control the Memory Monitor.
- merge(ImplT, SparkCombineFn<?, ?, AccumT, ?>) - Method in interface org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator
-
Merge other accumulator into this one.
- merge(Collection<W>, W) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.MergeContext
-
Signals to the framework that the windows in
toBeMerged
should be merged together to formmergeResult
. - merge(SequenceRangeAccumulator) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
- merge(BoundedWindow, Iterable<? extends Instant>) - Method in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Merges the given timestamps, which may have originated in separate windows, into the context of the result window.
- merge(BoundedWindow, Instant...) - Method in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
- merge(Accumulator<MetricsContainerStepMap, MetricsContainerStepMap>) - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
- merge(AccumulatorV2<MetricsContainerStepMap, MetricsContainerStepMap>) - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
- merge(AccumulatorV2<MetricsContainerStepMap, MetricsContainerStepMap>) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
- mergeAccumulator(AccumT) - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
-
Adds the input values represented by the given accumulator into this accumulator.
- mergeAccumulators(AccumT, Iterable<AccumT>) - Method in interface org.apache.beam.sdk.extensions.sql.udf.AggregateFn
-
Returns an accumulator representing the accumulation of all the input values accumulated in the merging accumulators.
- mergeAccumulators(Iterable<AccumT>) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- mergeAccumulators(Iterable<AccumT>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
- mergeAccumulators(Iterable<AccumT>) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- mergeAccumulators(Iterable<AccumT>) - Method in interface org.apache.beam.sdk.state.CombiningState
-
Merge the given accumulators according to the underlying
Combine.CombineFn
. - mergeAccumulators(Iterable<AccumT>) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
- mergeAccumulators(Iterable<AccumT>) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns an accumulator representing the accumulation of all the input values accumulated in the merging accumulators.
- mergeAccumulators(Iterable<AccumT>, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns an accumulator representing the accumulation of all the input values accumulated in the merging accumulators.
- mergeAccumulators(Iterable<HyperLogLogPlus>) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
- mergeAccumulators(Iterable<MergingDigest>) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
- mergeAccumulators(Iterable<double[]>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
- mergeAccumulators(Iterable<int[]>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
- mergeAccumulators(Iterable<Iterable<T>>) - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
- mergeAccumulators(Iterable<Object[]>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- mergeAccumulators(Iterable<Object[]>, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- mergeAccumulators(Iterable<String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
- mergeAccumulators(Iterable<String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
- mergeAccumulators(Iterable<List<String>>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
- mergeAccumulators(Iterable<List<T>>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
- mergeAccumulators(Iterable<List<V>>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- mergeAccumulators(Iterable<long[]>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
- mergeAccumulators(Iterable<long[]>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
- mergeAccumulators(Iterable<SequenceRangeAccumulator>) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
- mergeAccumulators(Iterable<SketchFrequencies.Sketch<InputT>>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
- mergeAccumulators(Iterable<CovarianceAccumulator>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- mergeAccumulators(Iterable<VarianceAccumulator>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- mergeAccumulators(Iterable<BeamBuiltinAggregations.BitXOr.Accum>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
- mergeAccumulators(Iterable<ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- mergeAccumulators(Iterable<Combine.Holder<V>>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
- mergeAccumulators(Iterable<Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
- mergeAccumulators(Long, Iterable<Long>) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
- MergeContext() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowFn.MergeContext
- MergeOverlappingIntervalWindows - Class in org.apache.beam.sdk.transforms.windowing
-
For internal use only; no backwards compatibility guarantees.
- MergeOverlappingIntervalWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.MergeOverlappingIntervalWindows
- mergeWideningNullable(Schema, Schema) - Static method in class org.apache.beam.sdk.schemas.SchemaUtils
-
Given two schema that have matching types, return a nullable-widened schema.
- mergeWindows(WindowFn.MergeContext) - Static method in class org.apache.beam.sdk.transforms.windowing.MergeOverlappingIntervalWindows
-
Merge overlapping
IntervalWindow
s. - mergeWindows(WindowFn.MergeContext) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- mergeWindows(WindowFn.MergeContext) - Method in class org.apache.beam.sdk.transforms.windowing.NonMergingWindowFn
- mergeWindows(WindowFn.MergeContext) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Does whatever merging of windows is necessary.
- mergeWithOuter(ResourceHint, boolean) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHint
-
Reconciles values of a hint when the hint specified on a transform is also defined in an outer context, for example on a composite transform, or specified in the transform's execution environment.
- mergeWithOuter(ResourceHints) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
- MERGING - Enum constant in enum class org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator.Type
- message() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
-
Underlying Message.
- message() - Method in class org.apache.beam.sdk.schemas.transforms.Cast.CompatibilityError
- messageFromBeamRow(Descriptors.Descriptor, Row, String, long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BeamRowToStorageApiProto
-
Forwards (@param changeSequenceNum) to
BeamRowToStorageApiProto.messageFromBeamRow(Descriptor, Row, String, String)
viaLong.toHexString(long)
. - messageFromBeamRow(Descriptors.Descriptor, Row, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BeamRowToStorageApiProto
-
Given a Beam
Row
object, returns a protocol-buffer message that can be used to write data using the BigQuery Storage streaming API. - messageFromGenericRecord(Descriptors.Descriptor, GenericRecord, String, long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.AvroGenericRecordToStorageApiProto
- messageFromGenericRecord(Descriptors.Descriptor, GenericRecord, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.AvroGenericRecordToStorageApiProto
-
Given an Avro
GenericRecord
object, returns a protocol-buffer message that can be used to write data using the BigQuery Storage streaming API. - messageFromMap(TableRowToStorageApiProto.SchemaInformation, Descriptors.Descriptor, AbstractMap<String, Object>, boolean, boolean, TableRow, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- messageFromTableRow(TableRowToStorageApiProto.SchemaInformation, Descriptors.Descriptor, TableRow, boolean, boolean, TableRow, String, long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- messageFromTableRow(TableRowToStorageApiProto.SchemaInformation, Descriptors.Descriptor, TableRow, boolean, boolean, TableRow, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
-
Given a BigQuery TableRow, returns a protocol-buffer message that can be used to write data using the BigQuery Storage API.
- messageId() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
- messageName() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
- MessageProducer - Interface in org.apache.beam.sdk.io.solace.broker
-
Base class for publishing messages to a Solace broker.
- MessageProducerUtils - Class in org.apache.beam.sdk.io.solace.broker
- MessageProducerUtils() - Constructor for class org.apache.beam.sdk.io.solace.broker.MessageProducerUtils
- MessageReceiver - Interface in org.apache.beam.sdk.io.solace.broker
-
Interface for receiving messages from a Solace broker.
- meta(List<?>, byte[]) - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecord
- metadata() - Method in class org.apache.beam.sdk.io.fs.MatchResult
-
MatchResult.Metadata
of matched files. - Metadata() - Constructor for class org.apache.beam.sdk.io.fs.MatchResult.Metadata
- Metadata(long, Instant, Instant, long, MetricsContainerStepMap) - Constructor for class org.apache.beam.runners.spark.io.SparkUnboundedSource.Metadata
- METADATA - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata
-
TupleTag for the main output.
- MetadataCoder - Class in org.apache.beam.sdk.io.fs
-
A
Coder
forMatchResult.Metadata
. - MetadataCoderV2 - Class in org.apache.beam.sdk.io.fs
- MetadataSpannerConfigFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
-
This class generates a SpannerConfig for the change stream metadata database by copying only the necessary fields from the SpannerConfig of the primary database.
- MetadataSpannerConfigFactory() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.MetadataSpannerConfigFactory
- MetadataTableAdminDao - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
-
Data access object for creating and dropping the metadata table.
- MetadataTableAdminDao(BigtableTableAdminClient, BigtableInstanceAdminClient, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- MetadataTableDao - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
-
Data access object for managing the state of the metadata Bigtable table.
- MetadataTableDao(BigtableDataClient, String, ByteString) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
- MetadataTableEncoder - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder
-
Helper methods that simplifies some conversion and extraction of metadata table content.
- MetadataTableEncoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
- metaStore() - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
-
The underlying
MetaStore
that actually manages tables. - metaStore() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- metaStore() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergCatalog
- metaStore(MetaStore) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
- MetaStore - Interface in org.apache.beam.sdk.extensions.sql.meta.store
-
The interface to handle CRUD of
BeamSql
table metadata. - method - Variable in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase
-
Method that implements the function.
- method() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.JavaScalarFunction
- METHOD - Static variable in interface org.apache.beam.sdk.extensions.sql.impl.planner.NodeStatsMetadata
- Metric - Interface in org.apache.beam.sdk.metrics
-
Marker interface for all user-facing metrics.
- MetricFiltering - Class in org.apache.beam.sdk.metrics
-
Implements matching for metrics filters.
- MetricKey - Class in org.apache.beam.sdk.metrics
-
Metrics are keyed by the step name they are associated with and the name of the metric.
- MetricKey() - Constructor for class org.apache.beam.sdk.metrics.MetricKey
- metricName() - Method in class org.apache.beam.sdk.metrics.MetricKey
-
The name of the metric.
- MetricName - Class in org.apache.beam.sdk.metrics
-
The name of a metric consists of a
MetricName.getNamespace()
and aMetricName.getName()
. - MetricName() - Constructor for class org.apache.beam.sdk.metrics.MetricName
- MetricNameFilter - Class in org.apache.beam.sdk.metrics
-
The name of a metric.
- MetricNameFilter() - Constructor for class org.apache.beam.sdk.metrics.MetricNameFilter
- MetricQueryResults - Class in org.apache.beam.sdk.metrics
-
The results of a query for metrics.
- MetricQueryResults() - Constructor for class org.apache.beam.sdk.metrics.MetricQueryResults
- metricRegistry() - Method in class org.apache.beam.runners.spark.metrics.SparkBeamMetricSource
- metricRegistry() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.SparkBeamMetricSource
- MetricResult<T> - Class in org.apache.beam.sdk.metrics
-
The results of a single current metric.
- MetricResult() - Constructor for class org.apache.beam.sdk.metrics.MetricResult
- MetricResults - Class in org.apache.beam.sdk.metrics
-
Methods for interacting with the metrics of a pipeline that has been executed.
- MetricResults() - Constructor for class org.apache.beam.sdk.metrics.MetricResults
- metrics() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
- metrics() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
- metrics() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
- metrics() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
- metrics() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
- metrics() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
- metrics() - Method in class org.apache.beam.runners.jet.JetPipelineResult
- metrics() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
- metrics() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult
- metrics() - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
- metrics() - Method in interface org.apache.beam.sdk.PipelineResult
-
Returns the object to access metrics from the pipeline.
- Metrics - Class in org.apache.beam.runners.flink.metrics
-
Helper for pretty-printing
Flink metrics
. - Metrics - Class in org.apache.beam.sdk.metrics
-
The
Metrics
is a utility class for producing various kinds of metrics for reporting properties of an executing pipeline. - Metrics() - Constructor for class org.apache.beam.runners.flink.metrics.Metrics
- METRICS_NAMESPACE - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
- METRICS_NAMESPACE - Static variable in class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
- MetricsAccumulator - Class in org.apache.beam.runners.flink.metrics
-
Accumulator of
MetricsContainerStepMap
. - MetricsAccumulator - Class in org.apache.beam.runners.spark.metrics
-
For resilience,
Accumulators
are required to be wrapped in a Singleton. - MetricsAccumulator - Class in org.apache.beam.runners.spark.structuredstreaming.metrics
-
AccumulatorV2
for Beam metrics captured inMetricsContainerStepMap
. - MetricsAccumulator() - Constructor for class org.apache.beam.runners.flink.metrics.MetricsAccumulator
- MetricsAccumulator() - Constructor for class org.apache.beam.runners.spark.metrics.MetricsAccumulator
- MetricsAccumulator() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
- MetricsAccumulator.AccumulatorCheckpointingSparkListener - Class in org.apache.beam.runners.spark.metrics
-
Spark Listener which checkpoints
MetricsContainerStepMap
values for fault-tolerance. - MetricsContainer - Interface in org.apache.beam.sdk.metrics
-
Holds the metrics for a single step.
- MetricsContainerHolder() - Constructor for class org.apache.beam.sdk.metrics.MetricsEnvironment.MetricsContainerHolder
- metricsContainers - Variable in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
- metricsContainers - Variable in class org.apache.beam.runners.flink.metrics.FlinkMetricContainerWithoutAccumulator
- MetricsContainerStepMapAccumulator - Class in org.apache.beam.runners.spark.metrics
-
AccumulatorV2
implementation forMetricsContainerStepMap
. - MetricsContainerStepMapAccumulator(MetricsContainerStepMap) - Constructor for class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
- MetricsEnvironment - Class in org.apache.beam.sdk.metrics
-
Manages and provides the metrics container associated with each thread.
- MetricsEnvironment() - Constructor for class org.apache.beam.sdk.metrics.MetricsEnvironment
- MetricsEnvironment.MetricsContainerHolder - Class in org.apache.beam.sdk.metrics
- MetricsEnvironment.MetricsEnvironmentState - Interface in org.apache.beam.sdk.metrics
-
Set the
MetricsContainer
for the associatedMetricsEnvironment
. - MetricsFilter - Class in org.apache.beam.sdk.metrics
-
Simple POJO representing a filter for querying metrics.
- MetricsFilter() - Constructor for class org.apache.beam.sdk.metrics.MetricsFilter
- MetricsFilter.Builder - Class in org.apache.beam.sdk.metrics
-
Builder for creating a MetricsFilter.
- MetricsOptions - Interface in org.apache.beam.sdk.metrics
-
Extension of PipelineOptions that defines MetricsSink specific options.
- MetricsOptions.NoOpMetricsSink - Class in org.apache.beam.sdk.metrics
-
A DefaultValueFactory that obtains the class of the NoOpMetricsSink if it exists on the classpath, and throws an exception otherwise.
- MetricsSink - Interface in org.apache.beam.sdk.metrics
-
Interface for all metric sinks.
- MicrobatchSource<T, CheckpointMarkT> - Class in org.apache.beam.runners.spark.io
-
A Source that accommodates Spark's micro-batch oriented nature and wraps an UnboundedSource.
- MicrobatchSource.Reader - Class in org.apache.beam.runners.spark.io
-
Mostly based on BoundedReadFromUnboundedSource's UnboundedToBoundedSourceAdapter, with some adjustments for Spark specifics.
- microsecondToInstant(long) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
- MicrosInstant - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A timestamp represented as microseconds since the epoch.
- MicrosInstant() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
- MILLIS_PER_DAY - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- mimeType() - Method in class org.apache.beam.sdk.io.fs.CreateOptions
-
The file-like resource mime type.
- Min - Class in org.apache.beam.sdk.transforms
-
PTransforms for computing the minimum of the elements in a PCollection, or the minimum of the values associated with each key in a PCollection of KVs.
- MIN_HASH_KEY - Static variable in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
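For illustration, a sketch of computing a global minimum with Min, assuming a Pipeline named p:

    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Min;
    import org.apache.beam.sdk.values.PCollection;

    // The smallest element of the input collection.
    PCollection<Integer> nums = p.apply(Create.of(3, 1, 4, 1, 5));
    PCollection<Integer> smallest = nums.apply(Min.integersGlobally());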
- MIN_UNIX_MILLIS - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- MINIMUM_PRECISION - Static variable in class org.apache.beam.sdk.extensions.zetasketch.HllCount
-
The minimum precision value you can set in HllCount.Init.Builder.withPrecision(int) is 10.
- MINIMUM_UPLOAD_BUFFER_SIZE_BYTES - Static variable in class org.apache.beam.sdk.io.aws2.options.S3Options.S3UploadBufferSizeBytesFactory
- MINIMUM_UPLOAD_BUFFER_SIZE_BYTES - Static variable in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
- minus(NodeStats) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
- minus(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- MINUS - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- MINUS - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.OpType
- minWatermarkHoldMs() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals
-
Returns the minimum over all watermark holds.
- Mod - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents a modification in a table emitted within a DataChangeRecord.
- Mod(String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
-
Constructs a mod from the primary key values, the old state of the row and the new state of the row.
- modeNames - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- modeToProtoMode(String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Modify the ack deadline for messages from subscription with ackIds to be deadlineSeconds from now.
- modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- modifyEnvironmentBeforeSubmission(Environment) - Method in class org.apache.beam.runners.dataflow.DataflowRunnerHooks
-
Allows the user to modify the environment of their job before it is submitted to the service for execution.
- ModType - Enum Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents the type of modification applied in the DataChangeRecord.
- MongoDbGridFSIO - Class in org.apache.beam.sdk.io.mongodb
-
IO to read and write data on MongoDB GridFS.
- MongoDbGridFSIO() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
- MongoDbGridFSIO.ConnectionConfiguration - Class in org.apache.beam.sdk.io.mongodb
-
Encapsulate the MongoDB GridFS connection logic.
- MongoDbGridFSIO.Parser<T> - Interface in org.apache.beam.sdk.io.mongodb
-
Interface for the parser that is used to parse the GridFSDBFile into the appropriate types.
- MongoDbGridFSIO.ParserCallback<T> - Interface in org.apache.beam.sdk.io.mongodb
-
Callback for the parser to use to submit data.
- MongoDbGridFSIO.Read<T> - Class in org.apache.beam.sdk.io.mongodb
-
A PTransform to read data from MongoDB GridFS.
- MongoDbGridFSIO.Read.BoundedGridFSSource - Class in org.apache.beam.sdk.io.mongodb
-
A BoundedSource for MongoDB GridFS.
- MongoDbGridFSIO.Write<T> - Class in org.apache.beam.sdk.io.mongodb
-
A PTransform to write data to MongoDB GridFS.
- MongoDbGridFSIO.WriteFn<T> - Interface in org.apache.beam.sdk.io.mongodb
-
Function that is called to write the data to the given GridFS OutputStream.
- MongoDbIO - Class in org.apache.beam.sdk.io.mongodb
-
IO to read and write data on MongoDB.
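A minimal read sketch, assuming a Pipeline named p; the URI, database, and collection names are placeholders.

    import org.apache.beam.sdk.io.mongodb.MongoDbIO;
    import org.apache.beam.sdk.values.PCollection;
    import org.bson.Document;

    // Read all documents from a MongoDB collection.
    PCollection<Document> docs = p.apply(
        MongoDbIO.read()
            .withUri("mongodb://localhost:27017")
            .withDatabase("my-database")
            .withCollection("my-collection"));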
- MongoDbIO.Read - Class in org.apache.beam.sdk.io.mongodb
-
A PTransform to read data from MongoDB.
- MongoDbIO.Write - Class in org.apache.beam.sdk.io.mongodb
-
A PTransform to write to a MongoDB database.
- MongoDbTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.mongodb
- MongoDbTable.DocumentToRow - Class in org.apache.beam.sdk.extensions.sql.meta.provider.mongodb
- MongoDbTable.RowToDocument - Class in org.apache.beam.sdk.extensions.sql.meta.provider.mongodb
- MongoDbTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.mongodb
- MongoDbTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTableProvider
- Monitoring - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- Monitoring - Class in org.apache.beam.io.requestresponse
-
Configures Metrics throughout various features of RequestResponseIO.
- Monitoring() - Constructor for class org.apache.beam.io.requestresponse.Monitoring
- Monitoring.Builder - Class in org.apache.beam.io.requestresponse
- MonitoringUtil - Class in org.apache.beam.runners.dataflow.util
-
A helper class for monitoring jobs submitted to the service.
- MonitoringUtil(DataflowClient) - Constructor for class org.apache.beam.runners.dataflow.util.MonitoringUtil
-
Construct a helper for monitoring.
- MonitoringUtil.JobMessagesHandler - Interface in org.apache.beam.runners.dataflow.util
-
An interface that can be used for defining callbacks to receive a list of JobMessages containing monitoring information.
- MonitoringUtil.LoggingHandler - Class in org.apache.beam.runners.dataflow.util
-
A handler that logs monitoring messages.
- MonitoringUtil.TimeStampComparator - Class in org.apache.beam.runners.dataflow.util
-
Comparator for sorting rows in increasing order based on timestamp.
- MonotonicallyIncreasing(Instant) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.MonotonicallyIncreasing
- months(int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
-
Returns a WindowFn that windows elements into periods measured by months.
- MOST_SIGNIFICANT_BITS_FIELD_NAME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
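As a sketch, assuming a timestamped PCollection<String> named events: windowing into one-month calendar windows.

    import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;

    // Each element is assigned to the calendar month containing its timestamp.
    PCollection<String> monthly =
        events.apply(Window.<String>into(CalendarWindows.months(1)));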
- MoveOptions - Interface in org.apache.beam.sdk.io.fs
-
An object that configures FileSystems.copy(java.util.List<org.apache.beam.sdk.io.fs.ResourceId>, java.util.List<org.apache.beam.sdk.io.fs.ResourceId>, org.apache.beam.sdk.io.fs.MoveOptions...), FileSystems.rename(java.util.List<org.apache.beam.sdk.io.fs.ResourceId>, java.util.List<org.apache.beam.sdk.io.fs.ResourceId>, org.apache.beam.sdk.io.fs.MoveOptions...), and FileSystems.delete(java.util.Collection<org.apache.beam.sdk.io.fs.ResourceId>, org.apache.beam.sdk.io.fs.MoveOptions...).
- MoveOptions.StandardMoveOptions - Enum Class in org.apache.beam.sdk.io.fs
-
Defines the standard MoveOptions.
- MovingAvg() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIOUtils.MovingAvg
- MqttIO - Class in org.apache.beam.sdk.io.mqtt
-
An unbounded source for an MQTT broker.
- MqttIO.ConnectionConfiguration - Class in org.apache.beam.sdk.io.mqtt
-
A POJO describing an MQTT connection.
- MqttIO.Read<T> - Class in org.apache.beam.sdk.io.mqtt
-
A PTransform to read from an MQTT broker.
- MqttIO.Write<InputT> - Class in org.apache.beam.sdk.io.mqtt
-
A PTransform to write and send a message to an MQTT server.
- MqttRecord - Class in org.apache.beam.sdk.io.mqtt
-
A container class for MQTT message metadata, including the topic name and payload.
- MqttRecord() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttRecord
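A minimal sketch of reading from an MQTT broker, assuming a Pipeline named p; the server URI and topic are hypothetical.

    import org.apache.beam.sdk.io.mqtt.MqttIO;

    // Read messages (as byte arrays) from an MQTT topic.
    p.apply(
        MqttIO.read()
            .withConnectionConfiguration(
                MqttIO.ConnectionConfiguration.create(
                    "tcp://localhost:1883", "my-topic")));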
- msgSpoolUsage() - Method in class org.apache.beam.sdk.io.solace.data.Semp.QueueData
- MSSQL - Static variable in class org.apache.beam.sdk.io.jdbc.JdbcUtil
- MultiDoFnFunction<InputT, OutputT> - Class in org.apache.beam.runners.spark.translation
-
DoFunctions ignore outputs that are not the main output.
- MultiDoFnFunction(MetricsContainerStepMapAccumulator, String, DoFn<InputT, OutputT>, SerializablePipelineOptions, TupleTag<OutputT>, List<TupleTag<?>>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, Map<TupleTag<?>, KV<WindowingStrategy<?, ?>, SideInputBroadcast<?>>>, WindowingStrategy<?, ?>, boolean, DoFnSchemaInformation, Map<String, PCollectionView<?>>, boolean, boolean) - Constructor for class org.apache.beam.runners.spark.translation.MultiDoFnFunction
- MultiLanguageBuilderMethod - Annotation Interface in org.apache.beam.sdk.expansion.service
- MultiLanguageConstructorMethod - Annotation Interface in org.apache.beam.sdk.expansion.service
- multimap() - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Create a StateSpec for a MultimapState, optimized for key lookups, key puts, and clear.
- multimap() - Static method in class org.apache.beam.sdk.transforms.Materializations
-
For internal use only; no backwards-compatibility guarantees.
- multimap(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Identical to StateSpecs.multimap(), but with key and value coders explicitly supplied.
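A sketch of declaring and using multimap state in a stateful DoFn, assuming a Beam release where MultimapState is available; the state id, DoFn name, and element types are hypothetical.

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.coders.VarIntCoder;
    import org.apache.beam.sdk.state.MultimapState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    class TagTrackerFn extends DoFn<KV<String, Integer>, Void> {
      // Multimap state with explicitly supplied key and value coders.
      @StateId("tags")
      private final StateSpec<MultimapState<String, Integer>> tagsSpec =
          StateSpecs.multimap(StringUtf8Coder.of(), VarIntCoder.of());

      @ProcessElement
      public void processElement(
          ProcessContext c, @StateId("tags") MultimapState<String, Integer> tags) {
        tags.put(c.element().getKey(), c.element().getValue());
      }
    }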
- MULTIMAP_MATERIALIZATION_URN - Static variable in class org.apache.beam.sdk.transforms.Materializations
-
The URN for a Materialization where the primitive view type is a multimap of fully specified windowed values.
- MultimapState<K, V> - Interface in org.apache.beam.sdk.state
-
A ReadableState cell mapping keys to bags of values.
- multimapView(PCollection<KV<K, V>>, PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a PCollectionView<Map<K, Iterable<V>>> capable of processing elements windowed using the provided WindowingStrategy.
- MultimapViewFn(PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
-
Deprecated.
- MultimapViewFn2(PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn2
- multimapViewUsingVoidKey(TupleTag<Materializations.MultimapView<Void, KV<K, V>>>, PCollection<KV<Void, KV<K, V>>>, PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
- MultiOutputOutputManagerFactory(TupleTag<OutputT>, Map<TupleTag<?>, OutputTag<WindowedValue<?>>>, Map<TupleTag<?>, Coder<WindowedValue<?>>>, Map<TupleTag<?>, Integer>, SerializablePipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.MultiOutputOutputManagerFactory
- MultiOutputOutputManagerFactory(TupleTag<OutputT>, Coder<WindowedValue<OutputT>>, SerializablePipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.MultiOutputOutputManagerFactory
- multiOutputOverrideFactory(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides
-
Returns a PTransformOverrideFactory that replaces a multi-output ParDo with a composite transform specialized for the DataflowRunner.
- multiplexElements(Iterator<BeamFnApi.Elements.Data>, Iterator<BeamFnApi.Elements.Timers>) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Dispatches the data and timers from the elements to corresponding receivers.
- multiply(double) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
- multiplyBy(double) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- MUST_BE_CONSISTENT_IF_SUCCEEDS - Enum constant in enum class org.apache.beam.sdk.testing.SourceTestUtils.ExpectedSplitOutcome
-
The operation must either fail, or succeed and the results be consistent.
- MUST_FAIL - Enum constant in enum class org.apache.beam.sdk.testing.SourceTestUtils.ExpectedSplitOutcome
-
The operation must fail (return null).
- MUST_SUCCEED_AND_BE_CONSISTENT - Enum constant in enum class org.apache.beam.sdk.testing.SourceTestUtils.ExpectedSplitOutcome
-
The operation must succeed and the results must be consistent.
- Mutability and thread safety - Search tag in org.apache.beam.sdk.io.BoundedSource.BoundedReader.getCurrentSource()
- Section
- mutablePairEncoder(Encoder<T1>, Encoder<T2>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
-
Creates a Spark Encoder for Spark's MutablePair of StructType with fields `_1` and `_2`.
- MutableState<EventT, ResultT> - Interface in org.apache.beam.sdk.extensions.ordered
-
Mutable state mutates when events apply to it.
- mutate(EventT) - Method in interface org.apache.beam.sdk.extensions.ordered.MutableState
-
The interface assumes that events will mutate the state without the possibility of throwing an error.
- MutationGroup - Class in org.apache.beam.sdk.io.gcp.spanner
-
A bundle of mutations that must be submitted atomically.
- MYSQL - Enum constant in enum class org.apache.beam.io.debezium.Connectors
- MYSQL - Static variable in class org.apache.beam.sdk.io.jdbc.JdbcUtil
N
- name - Variable in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropTable
- name - Variable in enum class org.apache.beam.sdk.io.kafka.KafkaTimestampType
- name - Variable in class org.apache.beam.sdk.transforms.PTransform
-
The base name of this PTransform, e.g., from defaults, or null if not yet assigned.
- name() - Element in annotation interface org.apache.beam.sdk.expansion.service.MultiLanguageBuilderMethod
- name() - Element in annotation interface org.apache.beam.sdk.expansion.service.MultiLanguageConstructorMethod
- name() - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
-
The name of this catalog, specified by the user.
- name() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- name() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
- name() - Method in interface org.apache.beam.sdk.schemas.FieldValueGetter
- name() - Method in interface org.apache.beam.sdk.schemas.FieldValueSetter
-
Returns the name of the field.
- name(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
- name(TypeDescription) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.InjectPackageStrategy
- named() - Method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
- named(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.MetricName
- named(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
- named(Class<?>, String, Map<String, String>) - Static method in class org.apache.beam.sdk.metrics.MetricName
- named(String, String) - Static method in class org.apache.beam.sdk.metrics.MetricName
- named(String, String) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
- named(String, String, Map<String, String>) - Static method in class org.apache.beam.sdk.metrics.MetricName
- NAMED - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters.Kind
- nameOf(int) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns the name of the field at the given index.
- names() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
- Naming ParDo transforms - Search tag in class org.apache.beam.sdk.transforms.ParDo
- Section
- NanosDuration - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A duration represented in nanoseconds.
- NanosDuration() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
- NanosInstant - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A timestamp represented as nanoseconds since the epoch.
- NanosInstant() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
- narrowing(Schema) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
- Narrowing() - Constructor for class org.apache.beam.sdk.schemas.transforms.Cast.Narrowing
- NATIVE - Enum constant in enum class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options.SorterType
- nativeSQL(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- Natural() - Constructor for class org.apache.beam.sdk.transforms.Top.Natural
- naturalOrder() - Static method in class org.apache.beam.sdk.transforms.Max
- naturalOrder() - Static method in class org.apache.beam.sdk.transforms.Min
- naturalOrder(T) - Static method in class org.apache.beam.sdk.transforms.Max
- naturalOrder(T) - Static method in class org.apache.beam.sdk.transforms.Min
- navigationFirstValue() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
- navigationLastValue() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
- NeedsDocker - Interface in org.apache.beam.runners.fnexecution.environment.testing
-
Category for integration tests that require Docker.
- needsMerge() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- NeedsRunner - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize TestPipeline for execution and expect to be executed by a PipelineRunner.
- Neo4j Aura - Search tag in class org.apache.beam.sdk.io.neo4j.Neo4jIO
- Section
- Neo4jIO - Class in org.apache.beam.sdk.io.neo4j
-
This is a Beam IO to read from, and write data to, Neo4j.
- Neo4jIO() - Constructor for class org.apache.beam.sdk.io.neo4j.Neo4jIO
- Neo4jIO.DriverConfiguration - Class in org.apache.beam.sdk.io.neo4j
-
This describes all the information needed to create a Neo4j Session.
- Neo4jIO.DriverProviderFromDriverConfiguration - Class in org.apache.beam.sdk.io.neo4j
-
Wraps a Neo4jIO.DriverConfiguration to provide a Driver.
- Neo4jIO.ReadAll<ParameterT, OutputT> - Class in org.apache.beam.sdk.io.neo4j
-
This is the class which handles the work behind the Neo4jIO.readAll() method.
- Neo4jIO.RowMapper<T> - Interface in org.apache.beam.sdk.io.neo4j
-
An interface used by Neo4jIO.ReadAll for converting each row of a Neo4j Result Record into an element of the resulting PCollection.
- Neo4jIO.WriteUnwind<ParameterT> - Class in org.apache.beam.sdk.io.neo4j
-
This is the class which handles the work behind the Neo4jIO.writeUnwind() method.
- nested() - Method in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
- NESTED - Static variable in class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated.The nested context: the value being encoded or decoded is (potentially) a part of a larger record/stream contents, and may have other parts encoded or decoded after it.
- NestedBytesBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedBytesBundle
- nestedFieldsById() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return the nested fields keyed by field ids.
- nestedFieldsByName() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return the nested fields keyed by field name.
- NestedIntBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedIntBundle
- Nested style - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- Section
- never() - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Returns a Watch.Growth.TerminationCondition that never holds (i.e., poll each input until its output is complete).
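As a sketch: Watch.Growth.never() is a common termination condition when watching a file pattern continuously (assuming a Pipeline named p; the path and interval are placeholders).

    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.transforms.Watch;
    import org.joda.time.Duration;

    // Poll for new files every 30 seconds and never stop watching.
    p.apply(
        FileIO.match()
            .filepattern("gs://my-bucket/input/*.csv")
            .continuously(Duration.standardSeconds(30), Watch.Growth.never()));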
- Never - Class in org.apache.beam.sdk.transforms.windowing
-
A Trigger which never fires.
- Never() - Constructor for class org.apache.beam.sdk.transforms.windowing.Never
- Never.NeverTrigger - Class in org.apache.beam.sdk.transforms.windowing
-
The actual trigger class for Never triggers.
- neverRetry() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
-
Never retry any failures.
- NEW_PARTITION_PREFIX - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- NEW_ROW - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
- NEW_ROW_AND_OLD_VALUES - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
- NEW_VALUES - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
- newBuilder() - Static method in class org.apache.beam.sdk.extensions.ml.AnnotateText
- newBuilder() - Static method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
- newBuilder() - Static method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
- newBuilder() - Static method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
- newBuilder() - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions
- newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Factory method to return a new instance of RpcQosOptions.Builder with all values set to their initial default values.
- newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions
- newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
- newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
- newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
Creates a builder for constructing a partition metadata instance.
- newBuilder() - Static method in class org.apache.beam.sdk.io.splunk.SplunkEvent
-
Provides a builder for creating SplunkEvent objects.
- newBuilder() - Static method in class org.apache.beam.sdk.io.splunk.SplunkWriteError
-
Provides a builder for creating SplunkWriteError objects.
- newBuilder() - Static method in class org.apache.beam.sdk.schemas.io.Failure
- newBundle(Map<String, RemoteOutputReceiver<?>>, Map<KV<String, String>, RemoteOutputReceiver<Timer<?>>>, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler, BundleCheckpointHandler) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor
-
Start a new bundle for the given BeamFnApi.ProcessBundleDescriptor identifier.
- newBundle(Map<String, RemoteOutputReceiver<?>>, Map<KV<String, String>, RemoteOutputReceiver<Timer<?>>>, StateRequestHandler, BundleProgressHandler, BundleSplitHandler, BundleCheckpointHandler, BundleFinalizationHandler) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor
-
Start a new bundle for the given BeamFnApi.ProcessBundleDescriptor identifier.
- newBundle(Map<String, RemoteOutputReceiver<?>>, BundleProgressHandler) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor
-
Start a new bundle for the given BeamFnApi.ProcessBundleDescriptor identifier.
- newBundle(Map<String, RemoteOutputReceiver<?>>, StateRequestHandler, BundleProgressHandler) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor
-
Start a new bundle for the given BeamFnApi.ProcessBundleDescriptor identifier.
- newClient(String, String, PubsubOptions) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
- newClient(String, String, PubsubOptions, String) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
-
Construct a new Pubsub client.
- newConfiguration(SerializableConfiguration) - Static method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
-
Returns a new populated Configuration object.
- newConnection(UnregisteredDriver, AvaticaFactory, String, Properties, CalciteSchema, JavaTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
- newDatabaseMetaData(AvaticaConnection) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
- newDataflowClient(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.util.DataflowTransport
-
Returns a Google Cloud Dataflow client builder.
- newDlqTransform(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryDlqProvider
- newDlqTransform(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubDlqProvider
- newDlqTransform(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
- newDlqTransform(String) - Method in interface org.apache.beam.sdk.schemas.io.GenericDlqProvider
-
Generate a DLQ output from the provided config value.
- newGoogleAdsClient(GoogleAdsOptions, String, Long, Long) - Method in class org.apache.beam.sdk.io.googleads.DefaultGoogleAdsClientFactory
- newGoogleAdsClient(GoogleAdsOptions, String, Long, Long) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsClientFactory
- newJob(SerializableConfiguration) - Static method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
-
Returns a new configured Job object.
- NewPartition - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
-
Represents a new partition as a result of splits and merges.
- NewPartition(Range.ByteStringRange, List<ChangeStreamContinuationToken>, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- NewPartition(Range.ByteStringRange, List<ChangeStreamContinuationToken>, Instant, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- newPluginInstance(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- newPopulation(Schema.TypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- newPopulation(Schema.TypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- newPopulation(SerializableFunction<BigDecimal, V>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- newPopulation(SerializableFunction<BigDecimal, V>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- newPreparedStatement(AvaticaConnection, Meta.StatementHandle, Meta.Signature, int, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
- newProvider(T) - Method in class org.apache.beam.sdk.testing.TestPipeline
-
Returns a new ValueProvider that is inaccessible before TestPipeline.run(), but will be accessible while the pipeline runs.
- newReader(PulsarClient, String) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
- newResultSet(AvaticaStatement, QueryState, Meta.Signature, TimeZone, Meta.Frame) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
- newResultSetMetaData(AvaticaStatement, Meta.Signature) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
- newSample(Schema.TypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- newSample(Schema.TypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- newSample(SerializableFunction<BigDecimal, V>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
- newSample(SerializableFunction<BigDecimal, V>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
- newStatement(AvaticaConnection, Meta.StatementHandle, int, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
- newStorageClient(GcsOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.util.Transport
-
Returns a Cloud Storage client builder using the specified GcsOptions.
- newTracker() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
- newTracker() - Method in class org.apache.beam.sdk.io.range.OffsetRange
- newTracker() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.HasDefaultTracker
-
Creates a new tracker for this.
- newTracker(KafkaSourceConsumerFn.OffsetHolder) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- newTracker(PartitionMetadata, TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
- newTracker(TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
- newTracker(Watch.GrowthState) - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
- NewVsCopy() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy
- newWatermarkEstimator() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.HasDefaultWatermarkEstimator
-
Creates a new watermark estimator for this.
- newWatermarkEstimator(Instant) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
- newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
- newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
- newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
- newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
- newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
- next() - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
- next() - Method in class org.apache.beam.runners.spark.translation.streaming.ParDoStateUpdateFn.SparkTimerInternalsIterator
- next() - Method in class org.apache.beam.runners.twister2.translation.wrappers.Twister2BoundedSource
- next() - Method in class org.apache.beam.runners.twister2.translation.wrappers.Twister2EmptySource
- next() - Method in class org.apache.beam.runners.twister2.translators.functions.ImpulseSource
- next() - Method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion.RecordBatchRowIterator
- next() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
- next() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
- next() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages.HL7v2MessagePagesIterator
- next() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Moves the pointer to the next record in the ResultSet if there is one.
- next(Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
-
Adds one nanosecond to the given timestamp.
- NEXT - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- nextBatch(TimestampedValue<T>...) - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Enqueue next micro-batch elements.
- nextBatch(T...) - Method in class org.apache.beam.runners.spark.io.CreateStream
-
For non-timestamped elements.
- nextFieldId() - Method in class org.apache.beam.sdk.values.Row.Builder
- nextRecord(WindowedValue<byte[]>) - Method in class org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat
- nextRecord(WindowedValue<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat
- nextSinkId() - Method in class org.apache.beam.runners.spark.translation.SparkTranslationContext
-
Generate a unique pCollection id number to identify runner-generated sinks.
- NFA - Class in org.apache.beam.sdk.extensions.sql.impl.nfa
-
NFA is an implementation of non-deterministic finite automata.
- NO_FIRING - Static variable in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
PaneInfo to use for elements on (and before) initial window assignment (including elements read from sources) before they have passed through a GroupByKey and are associated with a particular trigger firing.
- NO_TIMESTAMP_TYPE - Enum constant in enum class org.apache.beam.sdk.io.kafka.KafkaTimestampType
- No-argument SolaceIO#read() top-level method - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- NodeStats - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
This is a utility class to represent rowCount, rate and window.
- NodeStats() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
- NodeStatsMetadata - Interface in org.apache.beam.sdk.extensions.sql.impl.planner
-
This is a metadata used for row count and rate estimation.
- NodeStatsMetadata.Handler - Interface in org.apache.beam.sdk.extensions.sql.impl.planner
-
Handler API.
- No Global Shared State - Search tag in class org.apache.beam.sdk.transforms.ParDo
- Section
- noMoreSplits() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- NON_MERGING - Enum constant in enum class org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator.Type
- NON_PARALLEL_INPUTS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- NonCumulativeCostImpl() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl
- NonDeterministicException(Coder<?>, String) - Constructor for exception class org.apache.beam.sdk.coders.Coder.NonDeterministicException
- NonDeterministicException(Coder<?>, String, Coder.NonDeterministicException) - Constructor for exception class org.apache.beam.sdk.coders.Coder.NonDeterministicException
- NonDeterministicException(Coder<?>, List<String>) - Constructor for exception class org.apache.beam.sdk.coders.Coder.NonDeterministicException
- NonDeterministicException(Coder<?>, List<String>, Coder.NonDeterministicException) - Constructor for exception class org.apache.beam.sdk.coders.Coder.NonDeterministicException
- none() - Static method in class org.apache.beam.sdk.schemas.Schema.Options
- none() - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Default empty DisplayData instance.
- NONE - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
-
Use numWorkers machines.
- NONE - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- NONE - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters.Kind
- NONE - Enum constant in enum class org.apache.beam.sdk.extensions.sql.meta.ProjectSupport
- NONE - Enum constant in enum class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.PushDownOptions
- NONE - Enum constant in enum class org.apache.beam.sdk.io.TFRecordIO.CompressionType
-
Deprecated.
- NONE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
- NONE - Static variable in interface org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler.Stats
- NONE - Static variable in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
-
Constant Progress instance to be used when no work has been completed yet.
- NonKeyedBufferingElementsHandler<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput
-
A non-keyed implementation of a BufferingElementsHandler.
- NonMergingWindowFn<T, W> - Class in org.apache.beam.sdk.transforms.windowing
-
Abstract base class for WindowFns that do not merge windows.
- NonMergingWindowFn() - Constructor for class org.apache.beam.sdk.transforms.windowing.NonMergingWindowFn
- nonSeekableInputIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
- NOOP_CHECKPOINT_MARK - Static variable in interface org.apache.beam.sdk.io.UnboundedSource.CheckpointMark
- NoopCheckpointMark() - Constructor for class org.apache.beam.sdk.io.UnboundedSource.CheckpointMark.NoopCheckpointMark
- NoOpCounter - Class in org.apache.beam.sdk.metrics
-
A no-op implementation of Counter.
- NoopCredentialFactory - Class in org.apache.beam.sdk.extensions.gcp.auth
-
Constructs an OAuth credential to be used by the SDK and the SDK workers.
- NoopCredentialFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
- NoOpHistogram - Class in org.apache.beam.sdk.metrics
-
A no-op implementation of Histogram.
- NoOpMetricsSink() - Constructor for class org.apache.beam.sdk.metrics.MetricsOptions.NoOpMetricsSink
- NoopPathValidator - Class in org.apache.beam.sdk.extensions.gcp.storage
-
For internal use only; no backwards compatibility guarantees.
- NoOpStepContext - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
-
A StepContext for Spark Batch Runner execution.
- NoOpStepContext - Class in org.apache.beam.runners.twister2.utils
-
A no-op StepContext for the Twister2 runner.
- NoOpStepContext() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.NoOpStepContext
- NoOpStepContext() - Constructor for class org.apache.beam.runners.twister2.utils.NoOpStepContext
- NoOpWatermarkCache - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.cache
-
Synchronously compute the earliest partition watermark, by delegating the call to PartitionMetadataDao#getUnfinishedMinWatermark().
- NoOpWatermarkCache(PartitionMetadataDao) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.cache.NoOpWatermarkCache
- normalize() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- NormalizedRange - Class in org.apache.beam.sdk.io.azure.cosmos
- NormalizedRange(String, String) - Constructor for class org.apache.beam.sdk.io.azure.cosmos.NormalizedRange
- NoSuchSchemaException - Exception Class in org.apache.beam.sdk.schemas
-
Indicates that we are missing a schema for a type.
- NoSuchSchemaException() - Constructor for exception class org.apache.beam.sdk.schemas.NoSuchSchemaException
- not(SerializableMatcher<T>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.not(org.hamcrest.Matcher<T>).
- NOT_EQUALS - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- NOT_FOUND - Enum constant in enum class org.apache.beam.sdk.io.fs.MatchResult.Status
- Note on Data Encoding - Search tag in class org.apache.beam.sdk.transforms.Combine.CombineFn
- Section
- Note on Serialization - Search tag in class org.apache.beam.sdk.transforms.PTransform
- Section
- Note on timestamps - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- notEqualTo(T) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Asserts that the value in question is not equal to the provided value, according to Object.equals(java.lang.Object).
- notifyCheckpointComplete(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- notifyCheckpointComplete(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- notifyCheckpointComplete(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
- notifyCheckpointComplete(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
- notifyNoMoreSplits() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- notifyOfRemovedMetric(Metric, String, MetricGroup) - Method in class org.apache.beam.runners.flink.metrics.FileReporter
- notRegistered() - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.UnusedFn
- now() - Method in interface org.apache.beam.runners.direct.Clock
-
Returns the current time as an Instant.
- now(Matcher<Iterable<? extends Row>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery.RowsAssertion
- nullable - Variable in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.TypeWithNullability
- nullable() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- nullable(String, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns a nullable field with the given name and type.
- nullable(TableSchema.TypeName) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- NULLABLE_DATE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- NULLABLE_TIME - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- NULLABLE_TIMESTAMP - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- NULLABLE_TIMESTAMP_WITH_LOCAL_TZ - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- NullableCoder<T> - Class in org.apache.beam.sdk.coders
-
A NullableCoder encodes nullable values of type T using a nested Coder<T> that does not tolerate null values.
- nullContext() - Static method in class org.apache.beam.sdk.state.StateContexts
-
Returns a fake StateContext.
- NullCredentialInitializer - Class in org.apache.beam.sdk.extensions.gcp.auth
-
An HttpRequestInitializer for requests that don't have credentials.
- NullCredentialInitializer() - Constructor for class org.apache.beam.sdk.extensions.gcp.auth.NullCredentialInitializer
- nullRow(Schema) - Static method in class org.apache.beam.sdk.values.Row
-
Creates a new record filled with nulls.
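A small sketch combining schema nullability with Row.nullRow: build a schema whose fields are all nullable, then create an all-null row. Field names are hypothetical.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    Schema schema =
        Schema.builder()
            .addNullableField("id", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.DOUBLE)
            .build();

    // Every field of this row is null.
    Row empty = Row.nullRow(schema);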
- nulls() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The TypeDescriptor for nulls/Void.
- NullSizeEstimator<T> - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
-
NoOp implementation of a size estimator.
- NullSizeEstimator() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.NullSizeEstimator
- NullThroughputEstimator<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
-
NoOp implementation of a throughput estimator.
- NullThroughputEstimator() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.NullThroughputEstimator
- nullValue() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A SerializableMatcher with identical criteria to Matchers.nullValue().
- NUM_QUERY_SPLITS_MAX - Static variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
An upper bound on the number of splits for a query.
- NUM_REDUCES - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
-
MRJobConfig.NUM_REDUCES.
- NUMBER - Enum constant in enum class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout.Kind
- numberingDenseRank() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
- numberingPercentRank() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
- numberingRank() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
- numberingRowNumber() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
- numberOfRanges() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
- numberOfRecordsForRate - Variable in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- NUMERIC_LITERAL_WRAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- NUMERIC_TYPES - Static variable in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- numProcessingTimeTimers() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- numRecordsInCounter - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- numRetries() - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
- numRetries(int) - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
- numSupported() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTableFilter
-
This is primarily used by the cost based optimization to determine the benefit of performing predicate push-down for an IOSourceRel.
- numSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.DefaultTableFilter
- numSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
- numSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergFilter
- numSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableFilter
O
- OBJECT_TYPE_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- ObjectPool<KeyT, ObjectT> - Class in org.apache.beam.sdk.io.aws2.common
-
Reference counting object pool to easily share & destroy objects.
- ObjectPool(Function<KeyT, ObjectT>) - Constructor for class org.apache.beam.sdk.io.aws2.common.ObjectPool
- ObjectPool(Function<KeyT, ObjectT>, ThrowingConsumer<Exception, ObjectT>) - Constructor for class org.apache.beam.sdk.io.aws2.common.ObjectPool
- ObjectPool.ClientPool<ClientT> - Class in org.apache.beam.sdk.io.aws2.common
-
Client pool to easily share AWS clients per configuration.
- observe(RestrictionTracker<RestrictionT, PositionT>, RestrictionTrackers.ClaimObserver<PositionT>) - Static method in class org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers
-
Returns a thread safe RestrictionTracker which reports all claim attempts to the specified RestrictionTrackers.ClaimObserver.
- observeTimestamp(Instant) - Method in interface org.apache.beam.sdk.transforms.splittabledofn.TimestampObservingWatermarkEstimator
-
Update watermark estimate with latest output timestamp.
- observeTimestamp(Instant) - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.MonotonicallyIncreasing
- of() - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat.FooterCoder
- of() - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShardCoder
-
Returns an IsmShardCoder.
- of() - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefixCoder
- of() - Static method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- of() - Static method in class org.apache.beam.runners.flink.translation.functions.FlinkIdentityFunction
- of() - Static method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- of() - Static method in class org.apache.beam.sdk.coders.BigDecimalCoder
- of() - Static method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
- of() - Static method in class org.apache.beam.sdk.coders.BigEndianLongCoder
- of() - Static method in class org.apache.beam.sdk.coders.BigEndianShortCoder
- of() - Static method in class org.apache.beam.sdk.coders.BigIntegerCoder
- of() - Static method in class org.apache.beam.sdk.coders.BitSetCoder
- of() - Static method in class org.apache.beam.sdk.coders.BooleanCoder
-
Returns the singleton instance of BooleanCoder.
- of() - Static method in class org.apache.beam.sdk.coders.ByteArrayCoder
- of() - Static method in class org.apache.beam.sdk.coders.ByteCoder
- of() - Static method in class org.apache.beam.sdk.coders.DoubleCoder
- of() - Static method in class org.apache.beam.sdk.coders.DurationCoder
- of() - Static method in class org.apache.beam.sdk.coders.FloatCoder
- of() - Static method in class org.apache.beam.sdk.coders.InstantCoder
- of() - Static method in class org.apache.beam.sdk.coders.StringUtf8Coder
- of() - Static method in class org.apache.beam.sdk.coders.TextualIntegerCoder
- of() - Static method in class org.apache.beam.sdk.coders.VarIntCoder
- of() - Static method in class org.apache.beam.sdk.coders.VarLongCoder
- of() - Static method in class org.apache.beam.sdk.coders.VoidCoder
- of() - Static method in class org.apache.beam.sdk.expansion.service.ExpansionServiceSchemaTransformProvider
- of() - Static method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator.SequenceRangeAccumulatorCoder
- of() - Static method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- of() - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
- of() - Static method in class org.apache.beam.sdk.io.aws2.dynamodb.AttributeValueCoder
- of() - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
- of() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocumentCoder
- of() - Static method in class org.apache.beam.sdk.io.fs.MetadataCoder
-
Returns the singleton MetadataCoder instance.
- of() - Static method in class org.apache.beam.sdk.io.fs.MetadataCoderV2
-
Returns the singleton MetadataCoderV2 instance.
- of() - Static method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
-
Creates a ResourceIdCoder.
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation.RowMutationCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponseCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
- of() - Static method in class org.apache.beam.sdk.io.pulsar.PulsarMessageCoder
- of() - Static method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
- of() - Static method in class org.apache.beam.sdk.io.ReadableFileCoder
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeDate
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeDateTime
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTime
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestamp
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampLTZ
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampNTZ
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampTZ
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.geospatial.SnowflakeGeography
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.logical.SnowflakeBoolean
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeDouble
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeFloat
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeInteger
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeReal
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeArray
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeObject
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeVariant
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeChar
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeString
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeText
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarBinary
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
- of() - Static method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
- of() - Static method in class org.apache.beam.sdk.schemas.transforms.Cast.Narrowing
- of() - Static method in class org.apache.beam.sdk.schemas.transforms.Cast.Widening
- of() - Static method in class org.apache.beam.sdk.transforms.Mean
-
A Combine.CombineFn that computes the arithmetic mean (a.k.a. average) of an Iterable of numbers.
- of() - Static method in class org.apache.beam.sdk.transforms.Reshuffle
- of() - Static method in class org.apache.beam.sdk.transforms.ToJson
- of() - Static method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
-
Returns the default trigger.
- of() - Static method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
- of() - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
- of(int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
-
Return an instance of FixedBytes with specified byte array length.
- of(int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedPrecisionNumeric
-
Create a FixedPrecisionNumeric instance with specified scale and unspecified precision.
- of(int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
-
Return an instance of FixedString with specified string length.
- of(int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
-
Return an instance of VariableBytes with specified max byte array length.
- of(int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
-
Return an instance of VariableString with specified max string length.
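A minimal sketch of declaring these parameterized logical types on schema fields (the field names are illustrative assumptions, not from this index):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.logicaltypes.FixedString;
    import org.apache.beam.sdk.schemas.logicaltypes.VariableString;

    // Hypothetical schema: a fixed two-character code and a name capped at 100 characters.
    Schema schema =
        Schema.builder()
            .addLogicalTypeField("code", FixedString.of(2))
            .addLogicalTypeField("name", VariableString.of(100))
            .build();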
- of(int) - Static method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
- of(int...) - Static method in class org.apache.beam.sdk.io.range.ByteKey
-
Creates a new
ByteKey
backed by a copy of the specifiedint[]
. - of(int, int) - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefix
- of(int, int) - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeDecimal
- of(int, int) - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
- of(int, int) - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumeric
- of(int, int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedPrecisionNumeric
-
Create a FixedPrecisionNumeric instance with specified precision and scale.
- of(int, int, List<Coder<?>>, Coder<V>) - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
-
Returns an IsmRecordCoder with the specified key component coders and value coder.
- of(int, long) - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShard
-
Returns an IsmShard with the given id, block offset and no index offset.
- of(int, long, long) - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShard
-
Returns an IsmShard with the given id, block offset, and index offset.
- of(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a
PTransform
that takes an inputPCollection<T>
and returns aPCollection<List<T>>
with a single element containing the largestcount
elements of the inputPCollection<T>
, in decreasing order, sorted using the given Comparator<T>.
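A hedged usage sketch for this factory; the ByLength comparator and the words collection are illustrative assumptions:

    import java.io.Serializable;
    import java.util.Comparator;
    import java.util.List;
    import org.apache.beam.sdk.transforms.Top;
    import org.apache.beam.sdk.values.PCollection;

    // The comparator must implement both Comparator<T> and Serializable.
    class ByLength implements Comparator<String>, Serializable {
      @Override
      public int compare(String a, String b) {
        return Integer.compare(a.length(), b.length());
      }
    }

    // 'words' is an assumed PCollection<String>; the result holds one List with the 3 longest words.
    PCollection<List<String>> longest = words.apply(Top.of(3, new ByLength()));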
- of(int, Partition.PartitionFn<? super T>) - Static method in class org.apache.beam.sdk.transforms.Partition
-
Returns a new
Partition
PTransform
that divides its inputPCollection
into the given number of partitions, using the given partitioning function. - of(int, Partition.PartitionWithSideInputsFn<? super T>, Requirements) - Static method in class org.apache.beam.sdk.transforms.Partition
-
Returns a new
Partition
PTransform
that divides its inputPCollection
into the given number of partitions, using the given partitioning function. - of(long) - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
- of(long) - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeString
- of(long) - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeText
- of(long) - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
- of(long, long, long) - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- of(long, long, Instant) - Static method in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
- of(K) - Static method in class org.apache.beam.sdk.transforms.WithKeys
-
Returns a
PTransform
that takes aPCollection<V>
and returns aPCollection<KV<K, V>>
, where each of the values in the inputPCollection
has been paired with the given key. - of(RestrictionT, RestrictionT) - Static method in class org.apache.beam.sdk.transforms.splittabledofn.SplitResult
-
Returns a
SplitResult
for the specified primary and residual restrictions. - of(T, T...) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new
Create.Values
transform that produces aPCollection
containing the specified elements. - of(ClosureT, Requirements) - Static method in class org.apache.beam.sdk.transforms.Contextful
-
Constructs a pair of the given closure and its requirements.
- of(TableRow, RowMutationInformation) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation
- of(Timestamp, Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
-
Constructs a timestamp range.
- of(ByteString) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
- of(Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
-
Returns a
DynamicProtoCoder
for the Protocol BuffersDynamicMessage
for the givenDescriptors.Descriptor
. - of(PubsubMessage, long, long, String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
- of(PubsubMessage, long, String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
- of(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
-
A
CombineFn
that computes the maximum of a collection of elements of typeT
using an arbitraryComparator
, useful as an argument toCombine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
orCombine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
. - of(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
-
A
CombineFn
that computes the minimum of a collection of elements of typeT
using an arbitraryComparator
, useful as an argument toCombine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
orCombine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
. - of(String, int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
-
Return an instance of FixedBytes with specified byte array length.
- of(String, int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
-
Return an instance of FixedString with specified string length.
- of(String, int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
-
Return an instance of VariableBytes with specified max byte array length.
- of(String, int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
-
Return an instance of VariableString with specified max string length.
- of(Boolean) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- of(Byte) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- of(Class<? extends InputT>) - Static method in class org.apache.beam.sdk.extensions.jackson.AsJsons
-
Creates a
AsJsons
PTransform
that will transform aPCollection<InputT>
into aPCollection
of JSONStrings
representing those objects using a JacksonObjectMapper
. - of(Class<? extends OutputT>) - Static method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
-
Creates a
ParseJsons
PTransform
that will parse JSONStrings
into aPCollection<OutputT>
using a JacksonObjectMapper
. - of(Class<HL7v2Message>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
- of(Class<HL7v2ReadResponse>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponseCoder
- of(Class<T>) - Static method in class org.apache.beam.sdk.coders.SerializableCoder
-
Returns a
SerializableCoder
instance for the provided element class. - of(Class<T>) - Static method in class org.apache.beam.sdk.coders.StringDelegateCoder
- of(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an
AvroCoder
instance for the provided element class. - of(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
-
Returns an
AvroDatumFactory
instance for the provided element type. - of(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.ReflectDatumFactory
- of(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.SpecificDatumFactory
- of(Class<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
Returns a
ProtoCoder
for the given Protocol BuffersMessage
. - of(Class<T>) - Static method in class org.apache.beam.sdk.io.hadoop.WritableCoder
-
Returns a
WritableCoder
instance for the provided element class. - of(Class<T>) - Static method in class org.apache.beam.sdk.io.xml.JAXBCoder
-
Create a coder for a given type of JAXB annotated objects.
- of(Class<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a
TypeDescriptor
representing the given type. - of(Class<T>, boolean) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an
AvroCoder
instance for the given class, respecting whether to use Avro's Reflect* or Specific* suite for encoding and decoding. - of(Class<T>, boolean) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
-
Returns an
AvroDatumFactory
instance for the provided element type respecting Avro's Reflect* or Specific* suite for encoding and decoding. - of(Class<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an
AvroCoder
instance for the provided element type using the provided Avro schema - of(Class<T>, Schema, boolean) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an
AvroCoder
instance for the provided element type using the provided Avro schema, respecting whether to use Avro's Reflect* or Specific* suite for encoding and decoding. - of(Class<T>, TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.StringDelegateCoder
- of(Class<T>, TProtocolFactory) - Static method in class org.apache.beam.sdk.io.thrift.ThriftCoder
- of(Double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- of(Float) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- of(Integer) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- of(Iterable<PCollection<T>>) - Static method in class org.apache.beam.sdk.values.PCollectionList
-
Returns a
PCollectionList
containing the givenPCollections
, in order. - of(Iterable<T>) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new
Create.Values
transform that produces aPCollection
containing elements of the provided Iterable.
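A minimal sketch (the pipeline variable is an assumed Pipeline instance):

    import java.util.Arrays;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;

    // Materialize an in-memory Iterable as a PCollection.
    PCollection<String> letters = pipeline.apply(Create.of(Arrays.asList("a", "b", "c")));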
- of(Long) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- of(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
-
Convenient way to build a mocked bounded table.
- of(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
-
Convenient way to build a mocked unbounded table.
- of(Type) - Static method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a
TypeDescriptor
representing the given type. - of(Short) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- of(String) - Static method in class org.apache.beam.sdk.extensions.python.transforms.DataframeTransform
-
Instantiates a multi-language wrapper for a Python DataframeTransform with a given lambda function.
- of(String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- of(String) - Static method in class org.apache.beam.sdk.extensions.yaml.YamlTransform
-
Creates a new YamlTransform mapping a single input PCollection<Row> to a single PCollection<Row> output.
- of(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
- of(String, int, String) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- of(String, int, String, Integer) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- of(String, int, String, Integer, Map<String, ?>) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- of(String, long, boolean) - Static method in class org.apache.beam.sdk.io.redis.RedisCursor
- of(String, String, Map<String, T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
-
Creates a FhirSearchParameter of type T.
- of(String, Long, Long, MessageId, String, String) - Static method in class org.apache.beam.sdk.io.pulsar.PulsarSourceDescriptor
- of(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
- of(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
- of(String, String) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- of(String, String, Integer) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- of(String, String, Integer, Map<String, ?>) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- of(String, String, String) - Static method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
- of(String, String, RunnerApi.FunctionSpec, Coder<T>, Coder<W>) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
- of(String, Map<String, T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
-
Creates a FhirSearchParameter of type T, without a key.
- of(String, TableSchema.ColumnType) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
- of(String, TableSchema.ColumnType, TableSchema.DefaultType, Object) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
- of(String, HL7v2Message) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
-
From metadata and hl7v2Message to
HL7v2ReadResponse
. - of(String, SnowflakeDataType) - Static method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
- of(String, SnowflakeDataType, boolean) - Static method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
- of(String, Schema) - Static method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
-
Instantiates a multi-language wrapper for a Python RunInference with a given model loader.
- of(String, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns a field with the given name and type.
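For example, combined with Schema.of (also indexed below):

    import org.apache.beam.sdk.schemas.Schema;

    // A two-field schema built from individual fields.
    Schema schema =
        Schema.of(
            Schema.Field.of("id", Schema.FieldType.INT64),
            Schema.Field.of("name", Schema.FieldType.STRING));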
- of(String, Schema.FieldType, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
-
Instantiates a multi-language wrapper for a Python RunInference with a given model loader.
- of(String, PCollection<KV<K, InputT>>) - Static method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
A version of
KeyedPCollectionTuple.of(TupleTag, PCollection)
that takes in a string instead of a TupleTag. - of(String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
Returns a singleton
PCollectionRowTuple
containing the givenPCollection
keyed by the given tag. - of(String, PCollection<Row>, String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
A version of
PCollectionRowTuple.of(String, PCollection)
that takes in two PCollections of the same type. - of(String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
A version of
PCollectionRowTuple.of(String, PCollection)
that takes in three PCollections of the same type. - of(String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
A version of
PCollectionRowTuple.of(String, PCollection)
that takes in four PCollections of the same type. - of(String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
A version of
PCollectionRowTuple.of(String, PCollection)
that takes in five PCollections of the same type. - of(String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
A version of
PCollectionTuple.of(TupleTag, PCollection)
that takes in a String instead of aTupleTag
. - of(String, PCollection<T>, String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
A version of
PCollectionTuple.of(String, PCollection)
that takes in two PCollections of the same type. - of(String, PCollection<T>, String, PCollection<T>, String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
A version of
PCollectionTuple.of(String, PCollection)
that takes in three PCollections of the same type. - of(String, PCollection<T>, String, PCollection<T>, String, PCollection<T>, String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
A version of
PCollectionTuple.of(String, PCollection)
that takes in four PCollections of the same type. - of(String, PCollection<T>, String, PCollection<T>, String, PCollection<T>, String, PCollection<T>, String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
A version of
PCollectionTuple.of(String, PCollection)
that takes in five PCollections of the same type. - of(BigDecimal) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- of(BigInteger, BigInteger) - Static method in class org.apache.beam.sdk.io.cassandra.RingRange
- of(ByteBuffer) - Static method in class org.apache.beam.runners.flink.adapter.FlinkKey
- of(Consumer<PCollection<T>>) - Static method in class org.apache.beam.sdk.transforms.Tee
-
Returns a new Tee PTransform that will apply an auxiliary transform to the input as well as pass it on.
- of(List<?>, V) - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecord
-
Returns an IsmRecord with the specified key components and value.
- of(List<Coder<?>>) - Static method in class org.apache.beam.sdk.transforms.join.UnionCoder
-
Builds a union coder with the given list of element coders.
- of(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterAll
-
Returns an
AfterAll
Trigger
with the given subtriggers. - of(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
-
Returns an
AfterFirst
Trigger
with the given subtriggers. - of(List<TupleTag<?>>) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
- of(List<TupleTag<?>>) - Static method in class org.apache.beam.sdk.values.TupleTagList
-
Returns a
TupleTagList
containing the givenTupleTags
, in order. - of(Map<K, V>) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new
Create.Values
transform that produces aPCollection
ofKV
s corresponding to the keys and values of the specifiedMap
. - of(K, int) - Static method in class org.apache.beam.sdk.values.ShardedKey
- of(K, Coder<K>) - Static method in class org.apache.beam.runners.flink.adapter.FlinkKey
- of(K, Coder<K>) - Static method in class org.apache.beam.runners.local.StructuralKey
-
Create a new Structural Key of the provided key that can be encoded by the provided coder.
- of(K, V) - Static method in class org.apache.beam.sdk.values.KV
-
Returns a
KV
with the given key and value. - of(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an
AvroGenericCoder
instance for the Avro schema. - of(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroGenericCoder
- of(Caller<RequestT, ResponseT>, Coder<ResponseT>) - Static method in class org.apache.beam.io.requestresponse.RequestResponseIO
-
Instantiates a
RequestResponseIO
with aCaller
and aRequestResponseIO
Coder
with a default package private implementation ofCallShouldBackoff
based on https://sre.google/sre-book/handling-overload. - of(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>, Map<String, Coder>, Map<String, Map<String, ProcessBundleDescriptors.SideInputSpec>>, Map<String, Map<String, ProcessBundleDescriptors.BagUserStateSpec>>, Map<String, Map<String, ProcessBundleDescriptors.TimerSpec>>) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
- of(JobApi.MetricResults) - Static method in class org.apache.beam.runners.portability.PortableMetrics
- of(SideInputReader) - Static method in class org.apache.beam.runners.spark.util.CachedSideInputReader
-
Create a new cached
SideInputReader
. - of(SideInputReader, Collection<PCollectionView<?>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.CachedSideInputReader
-
Creates a SideInputReader that caches results for costly
Materializations
if present, otherwise the SideInputReader is returned as is. - of(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>) - Static method in class org.apache.beam.sdk.coders.DelegateCoder
- of(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>, TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.DelegateCoder
- of(Coder<K>) - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat.MetadataKeyCoder
- of(Coder<K>, Coder<ElemT>, Coder<? extends BoundedWindow>) - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
-
Create a new
KeyedWorkItemCoder
with the provided key coder, element coder, and window coder. - of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.coders.KvCoder
- of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.coders.MapCoder
-
Produces a MapCoder with the given keyCoder and valueCoder.
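A minimal sketch of composing a map coder from component coders:

    import java.util.Map;
    import org.apache.beam.sdk.coders.Coder;
    import org.apache.beam.sdk.coders.MapCoder;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.coders.VarLongCoder;

    // A coder for Map<String, Long>, composed from the key and value coders.
    Coder<Map<String, Long>> mapCoder = MapCoder.of(StringUtf8Coder.of(), VarLongCoder.of());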
- of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.coders.SortedMapCoder
-
Produces a SortedMapCoder with the given keyCoder and valueCoder.
- of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- of(Coder<KeyT>) - Static method in class org.apache.beam.sdk.coders.ShardedKeyCoder
- of(Coder<MatchResult.Metadata>) - Static method in class org.apache.beam.sdk.io.ReadableFileCoder
- of(Coder<BoundedWindow>, Coder<DestinationT>) - Static method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.CollectionCoder
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.DequeCoder
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.IterableCoder
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.LengthPrefixCoder
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.ListCoder
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.NullableCoder
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.OptionalCoder
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.SetCoder
-
Produces a
SetCoder
with the givenelementCoder
. - of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.SnappyCoder
-
Wraps the given coder into a
SnappyCoder
. - of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.ZstdCoder
-
Wraps the given coder into a
ZstdCoder
. - of(Coder<T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
- of(Coder<T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
- of(Coder<T>) - Static method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
- of(Coder<T>) - Static method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
- of(Coder<T>) - Static method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
-
Returns the
WindowedValues.ParamWindowedValueCoder
for the given valueCoder andGlobalWindow.Coder.INSTANCE
usingBoundedWindow.TIMESTAMP_MIN_VALUE
as the timestamp,WindowedValues.GLOBAL_WINDOWS
as the window andPaneInfo.NO_FIRING
as the pane info for parameters. - of(Coder<T>) - Static method in class org.apache.beam.sdk.values.WindowedValues.ValueOnlyWindowedValueCoder
-
Deprecated.
- of(Coder<T>, byte[]) - Static method in class org.apache.beam.sdk.coders.ZstdCoder
-
Wraps the given coder into a
ZstdCoder
. - of(Coder<T>, byte[], int) - Static method in class org.apache.beam.sdk.coders.ZstdCoder
-
Wraps the given coder into a
ZstdCoder
. - of(Coder<T>, int) - Static method in class org.apache.beam.sdk.coders.ZstdCoder
-
Wraps the given coder into a
ZstdCoder
. - of(Coder<T>, String) - Static method in class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
- of(Coder<T>, Coder<? extends BoundedWindow>) - Static method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
- of(Coder<T>, Coder<? extends BoundedWindow>) - Static method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- of(Coder<T>, Coder<? extends BoundedWindow>) - Static method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
-
Returns the
WindowedValues.ParamWindowedValueCoder
for the given valueCoder and windowCoder usingBoundedWindow.TIMESTAMP_MIN_VALUE
as the timestamp,WindowedValues.GLOBAL_WINDOWS
as the window andPaneInfo.NO_FIRING
as the pane info for parameters. - of(Coder<T>, Coder<? extends BoundedWindow>, Instant, Collection<? extends BoundedWindow>, PaneInfo) - Static method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
-
Returns the
WindowedValues.ParamWindowedValueCoder
for the given valueCoder and windowCoder using the supplied parameterized timestamp, windows and pane info forWindowedValues
. - of(Coder<T>, Coder<ErrorT>, Coder<? extends BoundedWindow>) - Static method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
- of(Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
- of(Coder<T>, Duration) - Static method in class org.apache.beam.runners.spark.io.CreateStream
-
Creates a new Spark based stream without forced watermark sync, intended for test purposes.
- of(Coder<T>, Duration, boolean) - Static method in class org.apache.beam.runners.spark.io.CreateStream
-
Creates a new Spark based stream intended for test purposes.
- of(Coder<ValueT>) - Static method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- of(AvroDatumFactory<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an
AvroCoder
instance for the providedAvroDatumFactory
using the provided Avro schema. - of(ProtoDomain, Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
-
Returns a
DynamicProtoCoder
for the Protocol BuffersDynamicMessage
for the givenDescriptors.Descriptor
. - of(ProtoDomain, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
-
Returns a
DynamicProtoCoder
for the Protocol BuffersDynamicMessage
for the given message name in aProtoDomain
. - of(BeamSqlTable) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
- of(TableSchema.Column...) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema
- of(TableSchema.TypeName) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- of(RowMutationInformation.MutationType, long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
-
Deprecated.
- of(RowMutationInformation.MutationType, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
-
Instantiate
RowMutationInformation
withRowMutationInformation.MutationType
and the changeSequenceNumber, which sets the BigQuery API _CHANGE_SEQUENCE_NUMBER
pseudo column, enabling custom user-supplied ordering of RowMutations.
- of(FhirBundleParameter, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
- of(PubsubMessage, long, String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
- of(JdbcIO.DataSourceConfiguration) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceProviderFromDataSourceConfiguration
- of(JdbcIO.DataSourceConfiguration) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.PoolableDataSourceProvider
- of(Neo4jIO.DriverConfiguration) - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverProviderFromDriverConfiguration
- of(ByteKeyRange) - Static method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
-
Instantiates a new
ByteKeyRangeTracker
with the specified range. - of(ByteKeyRange) - Static method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
- of(ByteKey, ByteKey) - Static method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Creates a new
ByteKeyRange
with the given start and end keys. - of(SnowflakeColumn...) - Static method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
- of(SnowflakeIO.DataSourceConfiguration) - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceProviderFromDataSourceConfiguration
- of(ValueProvider<X>, SerializableFunction<X, T>) - Static method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
-
Creates a
ValueProvider.NestedValueProvider
that wraps the provided value. - of(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.WithKeys
- of(FieldAccessDescriptor.FieldDescriptor.ListQualifier) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
- of(FieldAccessDescriptor.FieldDescriptor.MapQualifier) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
- of(Schema) - Static method in class org.apache.beam.sdk.coders.RowCoder
- of(Schema) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
-
Build a mocked bounded table with the specified type.
- of(Schema) - Static method in class org.apache.beam.sdk.io.jdbc.SchemaUtil.BeamRowMapper
- of(Schema) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
- of(Schema.Field...) - Static method in class org.apache.beam.sdk.schemas.Schema
- of(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Create a
Schema.FieldType
for the given type. - of(Schema, String, RexCall, Quantifier) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
- of(Schema, Cast.Validator) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
- of(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
-
Returns a
SchemaCoder
for the specified class. - of(SerializableFunction<TopicPartition, Boolean>) - Static method in class org.apache.beam.sdk.io.kafka.CheckStopReadingFnWrapper
- of(DisplayData.Path, Class<?>, String) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
- of(DoFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use
TestPipeline
with theDirectRunner
. - of(DoFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.ParDo
- of(CoGbkResultSchema, UnionCoder) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
-
Returns a
CoGbkResult.CoGbkResultCoder
for the given schema andUnionCoder
. - of(PTransform<PCollection<RequestT>, Result<KV<RequestT, ResponseT>>>, PTransform<PCollection<KV<RequestT, ResponseT>>, Result<KV<RequestT, ResponseT>>>) - Static method in class org.apache.beam.io.requestresponse.Cache.Pair
- of(PTransform<PCollection<T>, ?>) - Static method in class org.apache.beam.sdk.transforms.Tee
-
Returns a new Tee PTransform that will apply an auxiliary transform to the input as well as pass it on.
- of(SerializableBiFunction<V, V, V>) - Static method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Returns a
CombineFn
that uses the givenSerializableBiFunction
to combine values. - of(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
Returns a
CombineFn
that uses the givenSerializableFunction
to combine values. - of(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
-
Deprecated. Returns a
CombineFn
that uses the givenSerializableFunction
to combine values. - of(SerializableFunction<Iterable<V>, V>, int) - Static method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
Returns a
CombineFn
that uses the givenSerializableFunction
to combine values, attempting to buffer at leastbufferSize
values between invocations. - of(SerializableFunction<Row, byte[]>, SerializableFunction<byte[], Row>) - Static method in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializer
- of(SerializableFunction<Row, Integer>) - Static method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
- of(SerializableFunction<T, Instant>) - Static method in class org.apache.beam.sdk.transforms.WithTimestamps
-
For a
SerializableFunction
fn
fromT
toInstant
, outputs aPTransform
that takes an inputPCollection<T>
and outputs aPCollection<T>
containing every elementv
in the input where each element is output with a timestamp obtained as the result offn.apply(v)
. - of(SerializableFunction<V, K>) - Static method in class org.apache.beam.sdk.transforms.WithKeys
-
Returns a
PTransform
that takes aPCollection<V>
and returns aPCollection<KV<K, V>>
, where each of the values in the inputPCollection
has been paired with a key computed from the value by invoking the given SerializableFunction.
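A hedged sketch; 'words' is an assumed PCollection<String>:

    import org.apache.beam.sdk.transforms.WithKeys;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Key each word by its length; withKeyType helps coder inference when the function is a lambda.
    PCollection<KV<Integer, String>> keyed =
        words.apply(WithKeys.of((String s) -> s.length()).withKeyType(TypeDescriptors.integers()));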
- of(Trigger.OnceTrigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterAll
-
Returns an
AfterAll
Trigger
with the given subtriggers. - of(Trigger.OnceTrigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
-
Returns an
AfterFirst
Trigger
with the given subtriggers. - of(WindowFn<T, W>) - Static method in class org.apache.beam.sdk.values.WindowingStrategy
- of(PCollection<OutputElementT>, PCollection<FailureElementT>) - Static method in class org.apache.beam.sdk.transforms.WithFailures.Result
- of(PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionList
-
Returns a singleton
PCollectionList
containing the givenPCollection
. - of(PCollectionTuple) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.Result
- of(PCollectionTuple) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- of(PCollectionTuple, TupleTag<OutputElementT>, TupleTag<FailureElementT>) - Static method in class org.apache.beam.sdk.transforms.WithFailures.Result
- of(PCollectionView<ViewT>) - Static method in class org.apache.beam.runners.spark.translation.streaming.CreateStreamingSparkView.CreateSparkPCollectionView
- of(PCollectionView<ViewT>) - Static method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
- of(Row) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedPrecisionNumeric
-
Create a FixedPrecisionNumeric instance with specified argument row.
- of(TupleTag<?>) - Static method in class org.apache.beam.sdk.values.TupleTagList
-
Returns a singleton
TupleTagList
containing the givenTupleTag
. - of(TupleTag<?>, PCollection<?>) - Static method in class org.apache.beam.sdk.values.TaggedPValue
- of(TupleTag<InputT>, PCollection<KV<K, InputT>>) - Static method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Returns a new
KeyedPCollectionTuple<K>
with the given tag and initial PCollection. - of(TupleTag<T>, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
- of(TupleTag<V>, List<V>) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Returns a new CoGbkResult that contains just the given tag and given data.
- of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
- of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
- of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
- of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.SerializableCoder
-
Returns a
SerializableCoder
instance for the provided element type. - of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an
AvroCoder
instance for the provided element type. - of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- of(TypeDescriptor<T>, boolean) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an
AvroCoder
instance for the provided element type, respecting whether to use Avro's Reflect* or Specific* suite for encoding and decoding. - of(RelFieldCollation) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
- of(RexCall) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPCall
- of(RexLiteral) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- of(RexNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperation
- of(RexPatternFieldRef) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPFieldRef
- of(SqlOperator) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperator
- of(FrameworkConfig, ExpressionConverter, RelOptCluster, QueryTrait) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ConversionContext
- of(Table<String, String, byte[]>, Collection<byte[]>) - Static method in class org.apache.beam.runners.spark.stateful.StateAndTimers
- of(TopicPartition, Long, Instant, Long, Instant, List<String>) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSourceDescriptor
- of(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
-
Partitions the timestamp space into half-open intervals of the form [N * size, (N + 1) * size), where 0 is the epoch.
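A minimal sketch; 'events' is an assumed PCollection<String>:

    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Place each element in a 1-minute fixed window.
    PCollection<String> windowed =
        events.apply(Window.<String>into(FixedWindows.of(Duration.standardMinutes(1))));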
- of(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
Assigns timestamps into half-open intervals of the form [N * period, N * period + size), where 0 is the epoch.
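A minimal sketch of the sliding variant; 'events' is again an assumed PCollection<String>:

    import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // 10-minute windows, with a new window starting every minute.
    PCollection<String> sliding =
        events.apply(
            Window.<String>into(
                SlidingWindows.of(Duration.standardMinutes(10)).every(Duration.standardMinutes(1))));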
- of(Duration, String...) - Static method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
-
Construct the transform for the given duration and key fields.
- of(Duration, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
-
Construct the transform for the given duration and key fields.
- of(Duration, Duration) - Static method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
- of(ReadableDateTime) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
- of(OutputT, PCollection<FailureElementT>) - Static method in class org.apache.beam.sdk.transforms.WithFailures.Result
- of(RestrictionT) - Static method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.TruncateResult
-
Returns a
RestrictionTracker.TruncateResult
for the given restriction. - of(T) - Static method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
-
Creates a
ValueProvider.StaticValueProvider
that wraps the provided value. - of(T, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
-
A
CombineFn
that computes the maximum of a collection of elements of typeT
using an arbitraryComparator
andidentity
, useful as an argument toCombine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
orCombine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
. - of(T, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
-
A
CombineFn
that computes the minimum of a collection of elements of typeT
using an arbitraryComparator
and anidentity
, useful as an argument toCombine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
orCombine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
. - of(T, Exception) - Static method in class org.apache.beam.sdk.transforms.WithFailures.ExceptionElement
- of(T, CloseableResource.Closer<T>) - Static method in class org.apache.beam.runners.portability.CloseableResource
-
Creates a
CloseableResource
with the given resource and closer. - of(T, Coder<T>) - Static method in class org.apache.beam.runners.spark.translation.ValueAndCoderLazySerializable
- of(T, Instant, Collection<? extends BoundedWindow>, PaneInfo) - Static method in class org.apache.beam.sdk.values.WindowedValues
-
Returns a
WindowedValue
with the given value, timestamp, and windows. - of(T, Instant, BoundedWindow, PaneInfo) - Static method in class org.apache.beam.sdk.values.ValueInSingleWindow
- of(T, Instant, BoundedWindow, PaneInfo) - Static method in class org.apache.beam.sdk.values.WindowedValues
-
Returns a
WindowedValue
with the given value, timestamp, and window. - of(T, Instant, BoundedWindow, PaneInfo, ErrorT) - Static method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
- of(V, Instant) - Static method in class org.apache.beam.sdk.values.TimestampedValue
-
Returns a new
TimestampedValue
with the given value and timestamp. - ofByteSize(long) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches
-
Aim to create batches each with the specified byte size.
- ofByteSize(long, SerializableFunction<InputT, Long>) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches
-
Aim to create batches each with the specified byte size.
- ofCallerAndSetupTeardown(CallerSetupTeardownT, Coder<ResponseT>) - Static method in class org.apache.beam.io.requestresponse.RequestResponseIO
-
Instantiates a
RequestResponseIO
with aRequestResponseIO
Coder
, a default package private implementation ofCallShouldBackoff
based on https://sre.google/sre-book/handling-overload, and an implementation of both theCaller
andSetupTeardown
interfaces. - ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Max
-
A
CombineFn
that computes the maximum of a collection ofDouble
s, useful as an argument toCombine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
orCombine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
. - ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Min
-
A
CombineFn
that computes the minimum of a collection ofDouble
s, useful as an argument toCombine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
orCombine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
. - ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Sum
-
A
SerializableFunction
that computes the sum of anIterable
ofDouble
s, useful as an argument toCombine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
orCombine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
. - ofExpandedValue(PCollection<?>) - Static method in class org.apache.beam.sdk.values.TaggedPValue
- OFF - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
-
Deprecated. Special level used to turn off logging.
- OFF - Enum constant in enum class org.apache.beam.sdk.options.PipelineOptions.CheckEnabled
- OFF - Enum constant in enum class org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
-
Special level used to turn off logging.
- offer(ArtifactRetrievalService, ArtifactStagingServiceGrpc.ArtifactStagingServiceStub, String) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
-
Lazily stages artifacts by letting an ArtifactStagingService resolve and request artifacts.
- offerCoders(Coder[]) - Method in interface org.apache.beam.sdk.state.StateSpec
-
For internal use only; no backwards-compatibility guarantees.
- offeringClientsToPool(ControlClientPool.Sink, HeaderAccessor) - Static method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
-
Creates a new
FnApiControlClientPoolService
which will enqueue and vend new SDK harness connections. - ofFirstElement() - Static method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
- offset(Duration) - Method in interface org.apache.beam.sdk.state.Timer
-
Offsets the target timestamp used by
Timer.setRelative()
by the given duration. - OFFSET_INFINITY - Static variable in class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
Offset corresponding to infinity.
- offsetBasedDeduplicationSupported() - Method in class org.apache.beam.sdk.io.UnboundedSource
-
If offsetBasedDeduplicationSupported returns true, then the UnboundedSource needs to provide the following: UnboundedReader which provides offsets that are unique for each element and lexicographically ordered.
- OffsetBasedReader(OffsetBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
- OffsetBasedSource<T> - Class in org.apache.beam.sdk.io
-
A
BoundedSource
that uses offsets to define starting and ending positions. - OffsetBasedSource(long, long, long) - Constructor for class org.apache.beam.sdk.io.OffsetBasedSource
- OffsetBasedSource.OffsetBasedReader<T> - Class in org.apache.beam.sdk.io
-
A
Source.Reader
that implements code common to readers of allOffsetBasedSource
s. - OffsetByteRangeCoder - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- OffsetByteRangeCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
- OffsetRange - Class in org.apache.beam.sdk.io.range
-
A restriction represented by a range of integers [from, to).
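A hedged sketch pairing the restriction with the splittable-DoFn tracker indexed below (the claimed position is illustrative):

    import org.apache.beam.sdk.io.range.OffsetRange;
    import org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker;

    // A restriction covering offsets [0, 100).
    OffsetRange restriction = new OffsetRange(0, 100);
    OffsetRangeTracker tracker = new OffsetRangeTracker(restriction);
    boolean claimed = tracker.tryClaim(0L); // true while unclaimed positions remain in range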
- OffsetRange(long, long) - Constructor for class org.apache.beam.sdk.io.range.OffsetRange
- OffsetRange.Coder - Class in org.apache.beam.sdk.io.range
-
A coder for
OffsetRange
s. - OffsetRangeTracker - Class in org.apache.beam.sdk.io.range
-
A
RangeTracker
for non-negative positions of typelong
. - OffsetRangeTracker - Class in org.apache.beam.sdk.transforms.splittabledofn
-
A
RestrictionTracker
for claiming offsets in anOffsetRange
in a monotonically increasing fashion. - OffsetRangeTracker(long, long) - Constructor for class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
Creates an
OffsetRangeTracker
for the specified range. - OffsetRangeTracker(OffsetRange) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
- ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Max
-
A
CombineFn
that computes the maximum of a collection ofInteger
s, useful as an argument toCombine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
orCombine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
. - ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Min
-
A
CombineFn
that computes the minimum of a collection ofInteger
s, useful as an argument toCombine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
orCombine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
. - ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Sum
-
A
SerializableFunction
that computes the sum of anIterable
ofInteger
s, useful as an argument toCombine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
or Combine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>).
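A minimal sketch; 'counts' is an assumed PCollection<Integer>:

    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.PCollection;

    // Reduces the collection to one global sum.
    PCollection<Integer> total = counts.apply(Combine.globally(Sum.ofIntegers()));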
- ofKVs(String, Schema.FieldType, Schema.FieldType, Coder<KeyT>) - Static method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
- ofKVs(String, Schema, Coder<KeyT>) - Static method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
- ofLongs() - Static method in class org.apache.beam.sdk.transforms.Max
-
A
CombineFn
that computes the maximum of a collection ofLong
s, useful as an argument toCombine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
orCombine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
. - ofLongs() - Static method in class org.apache.beam.sdk.transforms.Min
-
A
CombineFn
that computes the minimum of a collection ofLong
s, useful as an argument toCombine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
orCombine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
. - ofLongs() - Static method in class org.apache.beam.sdk.transforms.Sum
-
A
SerializableFunction
that computes the sum of anIterable
ofLong
s, useful as an argument toCombine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
orCombine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>)
. - ofNamed(Map<String, ?>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
- ofNone() - Static method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
- ofPatientEverything(HealthcareApiClient, String, Map<String, Object>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
-
Instantiates a new GetPatientEverything FHIR resource pages iterator.
- ofPositional(List) - Static method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
- ofPrimitiveOutputsInternal(Pipeline, TupleTagList, Map<TupleTag<?>, Coder<?>>, WindowingStrategy<?, ?>, PCollection.IsBounded) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
For internal use only; no backwards-compatibility guarantees.
- ofProvider(ValueProvider<T>, Coder<T>) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns an
Create.OfValueProvider
transform that produces aPCollection
of a single element provided by the givenValueProvider
. - ofSearch(HealthcareApiClient, String, String, Map<String, Object>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
-
Instantiates a new search FHIR resource pages iterator.
- ofSize(long) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches
-
Aim to create batches each with the specified element count.
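A minimal sketch; 'keyed' is an assumed PCollection<KV<String, String>>:

    import org.apache.beam.sdk.transforms.GroupIntoBatches;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Emits per-key batches of up to 100 values each.
    PCollection<KV<String, Iterable<String>>> batches =
        keyed.apply(GroupIntoBatches.<String, String>ofSize(100));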
- OK - Enum constant in enum class org.apache.beam.sdk.io.fs.MatchResult.Status
- OK - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
- OLD_AND_NEW_VALUES - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
- on(List<PCollection<?>>) - Static method in class org.apache.beam.sdk.transforms.Wait
-
Waits on the given signal collections.
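A hedged sketch; 'content' and 'signal' are assumed PCollections:

    import org.apache.beam.sdk.transforms.Wait;
    import org.apache.beam.sdk.values.PCollection;

    // Per window, 'content' is held back until 'signal' is complete,
    // which is useful for sequencing side effects such as writes.
    PCollection<String> gated = content.apply(Wait.on(signal));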
- on(Join.FieldsEqual.Impl) - Method in class org.apache.beam.sdk.schemas.transforms.Join.Impl
-
Join the PCollections using the provided predicate.
- on(PCollection<?>...) - Static method in class org.apache.beam.sdk.transforms.Wait
-
Waits on the given signal collections.
- ON_TIME - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.PaneInfo.Timing
-
Pane was fired by a
AfterWatermark.pastEndOfWindow()
trigger because the input watermark progressed after the end of the window. - ON_TIME_AND_ONLY_FIRING - Static variable in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
PaneInfo
to use when there will be exactly one firing and it is on time. - onAdvance(int, int) - Method in class org.apache.beam.sdk.fn.stream.AdvancingPhaser
- onBatchCompleted(JavaStreamingListenerBatchCompleted) - Method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator.AccumulatorCheckpointingSparkListener
- onBatchCompleted(JavaStreamingListenerBatchCompleted) - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.WatermarkAdvancingStreamingListener
- onBeforeRequest(String, String, Message) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsIO.RateLimitPolicy
-
Called before a request is sent.
- onBeforeRequest(String, String, Message) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.SimpleRateLimitPolicy
- onBundleSuccess() - Method in interface org.apache.beam.sdk.transforms.DoFn.BundleFinalizer.Callback
- OnceTrigger(List<Trigger>) - Constructor for class org.apache.beam.sdk.transforms.windowing.Trigger.OnceTrigger
- onCheckpoint(BeamFnApi.ProcessBundleResponse) - Method in interface org.apache.beam.runners.fnexecution.control.BundleCheckpointHandler
- onCheckpoint(BeamFnApi.ProcessBundleResponse) - Method in class org.apache.beam.runners.fnexecution.control.BundleCheckpointHandlers.StateAndTimerBundleCheckpointHandler
- onClaimed(PositionT) - Method in interface org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers.ClaimObserver
-
Called when
RestrictionTracker.tryClaim(PositionT)
returns true. - onClaimFailed(PositionT) - Method in interface org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers.ClaimObserver
-
Called when
RestrictionTracker.tryClaim(PositionT)
returns false. - onClose(Consumer<FnApiControlClient>) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
- onCompleted() - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
- onCompleted() - Method in class org.apache.beam.sdk.fn.stream.DirectStreamObserver
- onCompleted() - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
- onCompleted() - Method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
- onCompleted(BeamFnApi.ProcessBundleResponse) - Method in interface org.apache.beam.runners.fnexecution.control.BundleProgressHandler
-
Handles the bundle's completion report.
- oneOfEncoder(List<Encoder<T>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
-
Creates a one-of Spark
Encoder
ofStructType
where each alternative is represented as a column / field named by its index with a separate Encoder
each.
- OneOfType - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A logical type representing a union of fields.
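A hedged sketch of building a union and wrapping a value in it; the case names are illustrative, and createValue is assumed to take a case name plus value:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.logicaltypes.OneOfType;

    // Hypothetical union with an int case and a string case.
    OneOfType oneOf =
        OneOfType.create(
            Schema.Field.of("intValue", Schema.FieldType.INT32),
            Schema.Field.of("stringValue", Schema.FieldType.STRING));
    OneOfType.Value value = oneOf.createValue("stringValue", "hello");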
- OneOfType.Value - Class in org.apache.beam.sdk.schemas.logicaltypes
-
Represents a single OneOf value.
- onError(String, String, Message, GoogleAdsError) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.SimpleRateLimitPolicy
- onError(String, String, Message, GoogleAdsErrorT) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsIO.RateLimitPolicy
-
Called after a request fails with a retryable error.
- onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
- onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.DirectStreamObserver
- onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
- onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
- onEventTime(InternalTimer<FlinkKey, TimerInternals.TimerData>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- onEventTime(InternalTimer<FlinkKey, VoidNamespace>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.DedupingOperator
- onGcTimer(Instant, ValueState<SortedMap<Instant, TimestampedValue<ValueT>>>, ValueState<SortedMap<Instant, Long>>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.FillGapsDoFn
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregateProjectMergeRule
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregationRule
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamBasicAggregationRule
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcMergeRule
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcSplittingRule
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCoGBKJoinRule
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOPushDownRule
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinAssociateRule
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinPushThroughJoinRule
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputJoinRule
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnnestRule
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcMergeRule
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRule
- onMerge(ReduceFn.OnMergeContext) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
- onNext(ReqT) - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
- onNext(T) - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
- onNext(T) - Method in class org.apache.beam.sdk.fn.stream.DirectStreamObserver
- onNext(V) - Method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
- onPollComplete(StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
Called by the Watch transform to compute a new termination state after every poll completion.
- onProcessingTime(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
- onProcessingTime(InternalTimer<FlinkKey, TimerInternals.TimerData>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- onProcessingTime(InternalTimer<FlinkKey, VoidNamespace>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.DedupingOperator
- onProgress(BeamFnApi.ProcessBundleProgressResponse) - Method in interface org.apache.beam.runners.fnexecution.control.BundleProgressHandler
-
Handles a progress report from the bundle while it is executing.
- onReceiverStart() - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
- onSeenNewOutput(Instant, StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
Called by the Watch transform to compute a new termination state, in case after calling the Watch.Growth.PollFn for the current input, the Watch.Growth.PollResult included a previously unseen OutputT.
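These TerminationCondition callbacks are usually exercised through the built-in conditions rather than implemented directly; a minimal sketch (the file pattern and intervals are illustrative) of polling for new files until growth stops:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.transforms.Watch;
    import org.joda.time.Duration;

    class WatchExample {
      static void build(Pipeline p) {
        p.apply(
            FileIO.match()
                .filepattern("gs://my-bucket/input/*.csv") // hypothetical pattern
                .continuously(
                    Duration.standardSeconds(30), // poll interval
                    // Stop once no new file has been seen for one hour.
                    Watch.Growth.afterTimeSinceNewOutput(Duration.standardHours(1))));
      }
    }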
- onStartup() - Method in interface org.apache.beam.sdk.harness.JvmInitializer
-
Implement onStartup to run some custom initialization immediately after the JVM is launched for pipeline execution.
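A minimal sketch of an implementation (the @AutoService registration and the property being set are illustrative); implementations are discovered via ServiceLoader when the worker JVM starts:

    import com.google.auto.service.AutoService;
    import org.apache.beam.sdk.harness.JvmInitializer;

    @AutoService(JvmInitializer.class)
    public class ExampleJvmInitializer implements JvmInitializer {
      @Override
      public void onStartup() {
        // Runs once, immediately after the worker JVM launches.
        System.setProperty("example.initialized", "true"); // hypothetical flag
      }
    }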
- onSuccess(String, String, Message) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsIO.RateLimitPolicy
-
Called after a request succeeds.
- onSuccess(String, String, Message) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.SimpleRateLimitPolicy
- onSuccess(List<KinesisRecord>) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicy
-
Called after Kinesis records are successfully retrieved.
- onSuccess(List<KinesisRecord>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
- onSuccess(List<KinesisRecord>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
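A minimal sketch of a custom policy built on these Kinesis callbacks (the fixed one-second backoff is illustrative; that the factory is instantiated per shard reader is an assumption here):

    import java.util.List;
    import org.apache.beam.sdk.io.aws2.kinesis.KinesisClientThrottledException;
    import org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord;
    import org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicy;
    import org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory;

    public class FixedBackoffPolicyFactory implements RateLimitPolicyFactory {
      @Override
      public RateLimitPolicy getRateLimitPolicy() {
        return new RateLimitPolicy() {
          @Override
          public void onSuccess(List<KinesisRecord> records) {
            // No delay after a successful fetch.
          }

          @Override
          public void onThrottle(KinesisClientThrottledException e) throws InterruptedException {
            Thread.sleep(1_000); // simple fixed backoff (illustrative)
          }
        };
      }
    }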
- onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages.ConvertMessagesDoFn
- onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiFlushAndFinalizeDoFn
- onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
- onThrottle(KinesisClientThrottledException) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicy
-
Called after the Kinesis client is throttled.
- onThrottle(KinesisClientThrottledException) - Method in class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
- onTimer(String, String, KeyT, BoundedWindow, Instant, Instant, TimeDomain) - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
- onTimer(String, String, KeyT, BoundedWindow, Instant, Instant, TimeDomain) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
- onTimer(String, String, KeyT, BoundedWindow, Instant, Instant, TimeDomain) - Method in class org.apache.beam.runners.spark.translation.DoFnRunnerWithMetrics
- onTimer(String, Instant, TimerMap, TimerMap, ValueState<SortedMap<Instant, TimestampedValue<ValueT>>>, ValueState<SortedMap<Instant, Long>>, DoFn.OutputReceiver<ValueT>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.FillGapsDoFn
- OnTimerContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
- onTrigger(ReduceFn.OnTriggerContext) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
- onWindowExpiration(BoundedWindow, Instant, KeyT) - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
- onWindowExpiration(BoundedWindow, Instant, KeyT) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
- onWindowExpiration(BoundedWindow, Instant, KeyT) - Method in class org.apache.beam.runners.spark.translation.DoFnRunnerWithMetrics
- OnWindowExpirationContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.OnWindowExpirationContext
- open() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- open() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
- open() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.DedupingOperator
- open() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.PartialReduceBundleOperator
- open() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
-
Returns a ReadableByteChannel reading the data from this file, potentially decompressing it using FileIO.ReadableFile.getCompression().
- open(String) - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Opens a uniquely named temporary file and initializes the writer using FileBasedSink.Writer.prepareWrite(java.nio.channels.WritableByteChannel).
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
- open(WritableByteChannel) - Method in interface org.apache.beam.sdk.io.FileIO.Sink
-
Initializes writing to the given channel.
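A minimal sketch (the output prefix is illustrative) of supplying a built-in FileIO.Sink, TextIO.sink(), to FileIO.write(); the framework opens a channel for each output file it writes and hands it to the sink through this method:

    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.values.PCollection;

    class WriteWithSink {
      static void build(PCollection<String> lines) {
        lines.apply(
            FileIO.<String>write()
                .via(TextIO.sink()) // a FileIO.Sink implementation
                .to("/tmp/out"));   // hypothetical output directory
      }
    }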
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.TextIO.Sink
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.TFRecordIO.Sink
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.Sink
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
- open(SourceInputSplit<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat
- open(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Opens an object in GCS.
- open(ClassLoaderFileSystem.ClassLoaderResourceId) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
- open(ResourceId) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Returns a read channel for the given ResourceId.
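A minimal sketch (the path is illustrative) of resolving a ResourceId and opening it through the FileSystems registry:

    import java.io.IOException;
    import java.nio.channels.ReadableByteChannel;
    import org.apache.beam.sdk.io.FileSystems;
    import org.apache.beam.sdk.io.fs.ResourceId;

    class OpenResource {
      static void read() throws IOException {
        ResourceId id = FileSystems.matchNewResource("/tmp/input.txt", /* isDirectory= */ false);
        try (ReadableByteChannel channel = FileSystems.open(id)) {
          // Read bytes from the channel...
        }
      }
    }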
- open(Configuration) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkDoFnFunction
- open(Configuration) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction
- open(Configuration) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStagePruningFunction
- open(Configuration) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkMergingNonShuffleReduceFunction
- open(Configuration) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkMultiOutputPruningFunction
- open(Configuration) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkPartialReduceFunction
- open(Configuration) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkReduceFunction
- open(Configuration) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkStatefulDoFnFunction
- open(Configuration) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
-
Initialize and restore state before starting execution of the source.
- open(GenericInputSplit) - Method in class org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat
- open(MetricConfig) - Method in class org.apache.beam.runners.flink.metrics.FileReporter
- open(ResourceIdT) - Method in class org.apache.beam.sdk.io.FileSystem
-
Returns a read channel for the given FileSystem.
- openSeekable() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
-
Returns a SeekableByteChannel equivalent to FileIO.ReadableFile.open(), but fails if this file is not seekable.
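A minimal sketch of consuming matched files (the pattern is illustrative); readFullyAsUTF8String() is a convenience built on open():

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;

    class ReadMatchedFiles {
      static void build(Pipeline p) {
        p.apply(FileIO.match().filepattern("/tmp/data/*.txt")) // hypothetical path
            .apply(FileIO.readMatches())
            .apply(ParDo.of(
                new DoFn<FileIO.ReadableFile, String>() {
                  @ProcessElement
                  public void process(
                      @Element FileIO.ReadableFile file, OutputReceiver<String> out)
                      throws IOException {
                    out.output(file.readFullyAsUTF8String()); // uses open() underneath
                  }
                }));
      }
    }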
- Operations - Search tag in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1
- Section
- Optimization - Search tag in class org.apache.beam.sdk.transforms.ParDo
- Section
- optimizedWrites() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If true, enables new codepaths that are expected to use less resources while writing to BigQuery.
- OptionalCoder<T> - Class in org.apache.beam.sdk.coders
- options - Variable in class org.apache.beam.sdk.extensions.sorter.ExternalSorter
- options() - Method in class org.apache.beam.runners.jobsubmission.JobPreparation
- options() - Static method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
- Options() - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Options
- Options() - Constructor for class org.apache.beam.runners.direct.DirectRegistrar.Options
- Options() - Constructor for class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Options
- Options() - Constructor for class org.apache.beam.runners.jet.JetRunnerRegistrar.Options
- Options() - Constructor for class org.apache.beam.runners.prism.PrismRegistrar.Options
- Options() - Constructor for class org.apache.beam.runners.spark.SparkRunnerRegistrar.Options
- Options() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunnerRegistrar.Options
- Options() - Constructor for class org.apache.beam.runners.twister2.Twister2RunnerRegistrar.Options
- Options() - Constructor for class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
- Options() - Constructor for class org.apache.beam.sdk.options.ManualDockerEnvironmentOptions.Options
- Options() - Constructor for class org.apache.beam.sdk.options.RemoteEnvironmentOptions.Options
- Options() - Constructor for class org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions.Options
- OptionsRegistrar() - Constructor for class org.apache.beam.runners.portability.testing.TestUniversalRunner.OptionsRegistrar
- ORACLE - Enum constant in enum class org.apache.beam.io.debezium.Connectors
- ORACLE - Static variable in class org.apache.beam.sdk.io.jdbc.JdbcUtil
- Order - Class in org.apache.beam.sdk.extensions.sql.example.model
-
Describes an order.
- Order() - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Order
- Order(int, int) - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Order
- OrderByKey() - Constructor for class org.apache.beam.sdk.values.KV.OrderByKey
- OrderByValue() - Constructor for class org.apache.beam.sdk.values.KV.OrderByValue
- OrderedEventProcessor<EventT, EventKeyT, ResultT, StateT> - Class in org.apache.beam.sdk.extensions.ordered
-
Transform for processing ordered events.
- OrderedEventProcessor() - Constructor for class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessor
- OrderedEventProcessorResult<KeyT, ResultT, EventT> - Class in org.apache.beam.sdk.extensions.ordered
-
The result of the ordered processing.
- orderedList(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
- OrderedListState<T> - Interface in org.apache.beam.sdk.state
-
A ReadableState cell containing a list of values sorted by timestamp.
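A minimal sketch pairing OrderedListState with the orderedList(Coder<T>) StateSpec above (the state id and element types are illustrative):

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.state.OrderedListState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.TimestampedValue;
    import org.joda.time.Instant;

    class OrderedBufferFn extends DoFn<KV<String, String>, String> {
      @StateId("buffer")
      private final StateSpec<OrderedListState<String>> bufferSpec =
          StateSpecs.orderedList(StringUtf8Coder.of());

      @ProcessElement
      public void process(
          @Element KV<String, String> element,
          @Timestamp Instant ts,
          @StateId("buffer") OrderedListState<String> buffer) {
        buffer.add(TimestampedValue.of(element.getValue(), ts)); // kept sorted by timestamp
      }
    }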
- OrderedProcessingGlobalSequenceHandler(Class<EventT>, Class<KeyT>, Class<StateT>, Class<ResultT>) - Constructor for class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler.OrderedProcessingGlobalSequenceHandler
- OrderedProcessingHandler<EventT, KeyT, StateT, ResultT> - Class in org.apache.beam.sdk.extensions.ordered
-
Parent class for Ordered Processing configuration handlers.
- OrderedProcessingHandler(Class<EventT>, Class<KeyT>, Class<StateT>, Class<ResultT>) - Constructor for class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Provide concrete classes which will be used by the ordered processing transform.
- OrderedProcessingHandler.OrderedProcessingGlobalSequenceHandler<EventT, KeyT, StateT, ResultT> - Class in org.apache.beam.sdk.extensions.ordered
-
Parent class for Ordered Processing configuration handlers to handle processing of the events where global sequence is used.
- OrderedProcessingStatus - Class in org.apache.beam.sdk.extensions.ordered
-
Indicates the status of ordered processing for a particular key.
- OrderedProcessingStatus() - Constructor for class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
- OrderedProcessingStatus.Builder - Class in org.apache.beam.sdk.extensions.ordered
- OrderKey - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
The OrderKey class stores the information to sort a column.
- orFinally(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
Specify an ending condition for this trigger.
- OrFinallyTrigger - Class in org.apache.beam.sdk.transforms.windowing
-
A Trigger that executes according to its main trigger until its "finally" trigger fires.
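A minimal sketch of composing one with Trigger.orFinally(...) (window size and element count are illustrative):

    import org.apache.beam.sdk.transforms.windowing.AfterPane;
    import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Repeatedly;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.joda.time.Duration;

    class TriggerExample {
      static Window<String> windowing() {
        return Window.<String>into(FixedWindows.of(Duration.standardMinutes(10)))
            .triggering(
                // Fire after every 100 elements, until the watermark ends the window.
                Repeatedly.forever(AfterPane.elementCountAtLeast(100))
                    .orFinally(AfterWatermark.pastEndOfWindow()))
            .withAllowedLateness(Duration.ZERO)
            .discardingFiredPanes();
      }
    }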
- org.apache.beam.io.debezium - package org.apache.beam.io.debezium
-
Transforms for reading from DebeziumIO.
- org.apache.beam.io.requestresponse - package org.apache.beam.io.requestresponse
-
Package provides Beam I/O transform support for safely reading from and writing to Web APIs.
- org.apache.beam.runners.dataflow - package org.apache.beam.runners.dataflow
-
Provides a Beam runner that executes pipelines on the Google Cloud Dataflow service.
- org.apache.beam.runners.dataflow.internal - package org.apache.beam.runners.dataflow.internal
-
Implementation of the DataflowRunner.
- org.apache.beam.runners.dataflow.options - package org.apache.beam.runners.dataflow.options
-
Provides PipelineOptions specific to Google Cloud Dataflow.
- org.apache.beam.runners.dataflow.util - package org.apache.beam.runners.dataflow.util
-
Provides miscellaneous internal utilities used by the Google Cloud Dataflow runner.
- org.apache.beam.runners.direct - package org.apache.beam.runners.direct
-
Defines the PipelineOptions.DirectRunner which executes both Bounded and Unbounded Pipelines on the local machine.
- org.apache.beam.runners.flink - package org.apache.beam.runners.flink
-
Internal implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.flink.adapter - package org.apache.beam.runners.flink.adapter
-
Adaptors for using Beam transforms in Apache Flink pipelines.
- org.apache.beam.runners.flink.metrics - package org.apache.beam.runners.flink.metrics
-
Internal metrics implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.flink.translation.functions - package org.apache.beam.runners.flink.translation.functions
-
Internal implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.flink.translation.types - package org.apache.beam.runners.flink.translation.types
-
Internal implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.flink.translation.utils - package org.apache.beam.runners.flink.translation.utils
-
Internal implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.flink.translation.wrappers - package org.apache.beam.runners.flink.translation.wrappers
-
Internal implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.flink.translation.wrappers.streaming - package org.apache.beam.runners.flink.translation.wrappers.streaming
-
Internal implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.flink.translation.wrappers.streaming.io - package org.apache.beam.runners.flink.translation.wrappers.streaming.io
-
Internal implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.flink.translation.wrappers.streaming.io.source - package org.apache.beam.runners.flink.translation.wrappers.streaming.io.source
-
Internal implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded - package org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded
-
Internal implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.impulse - package org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.impulse
-
Internal implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded - package org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded
-
Internal implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput - package org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput
-
Classes for buffering elements for achieving @RequiresStableInput.
- org.apache.beam.runners.flink.translation.wrappers.streaming.state - package org.apache.beam.runners.flink.translation.wrappers.streaming.state
-
Internal state implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.fnexecution.artifact - package org.apache.beam.runners.fnexecution.artifact
-
Pipeline execution-time artifact-management services, including abstract implementations of the Artifact Retrieval Service.
- org.apache.beam.runners.fnexecution.control - package org.apache.beam.runners.fnexecution.control
-
Utilities for a Beam runner to interact with the Fn API Control Service via Java abstractions.
- org.apache.beam.runners.fnexecution.data - package org.apache.beam.runners.fnexecution.data
-
Utilities for a Beam runner to interact with the Fn API Data Service via Java abstractions.
- org.apache.beam.runners.fnexecution.environment - package org.apache.beam.runners.fnexecution.environment
-
Classes used to instantiate and manage SDK harness environments.
- org.apache.beam.runners.fnexecution.environment.testing - package org.apache.beam.runners.fnexecution.environment.testing
-
Test utilities for the environment management package.
- org.apache.beam.runners.fnexecution.logging - package org.apache.beam.runners.fnexecution.logging
-
Classes used to log informational messages over the Beam Fn Logging Service.
- org.apache.beam.runners.fnexecution.provisioning - package org.apache.beam.runners.fnexecution.provisioning
-
Provision API services.
- org.apache.beam.runners.fnexecution.state - package org.apache.beam.runners.fnexecution.state
-
State API services.
- org.apache.beam.runners.fnexecution.status - package org.apache.beam.runners.fnexecution.status
-
Worker Status API services.
- org.apache.beam.runners.fnexecution.translation - package org.apache.beam.runners.fnexecution.translation
-
Shared utilities for a Beam runner to translate portable pipelines.
- org.apache.beam.runners.fnexecution.wire - package org.apache.beam.runners.fnexecution.wire
-
Wire coders for communications between runner and SDK harness.
- org.apache.beam.runners.jet - package org.apache.beam.runners.jet
-
Implementation of the Beam runner for Hazelcast Jet.
- org.apache.beam.runners.jet.metrics - package org.apache.beam.runners.jet.metrics
-
Helper classes for implementing metrics in the Hazelcast Jet based runner.
- org.apache.beam.runners.jet.processors - package org.apache.beam.runners.jet.processors
-
Individual DAG node processors used by the Beam runner for Hazelcast Jet.
- org.apache.beam.runners.jobsubmission - package org.apache.beam.runners.jobsubmission
-
Job management services for use in Beam runners.
- org.apache.beam.runners.local - package org.apache.beam.runners.local
-
Utilities useful when executing a pipeline on a single machine.
- org.apache.beam.runners.portability - package org.apache.beam.runners.portability
-
Support for executing a pipeline locally over the Beam fn API.
- org.apache.beam.runners.portability.testing - package org.apache.beam.runners.portability.testing
-
Testing utilities for the reference runner.
- org.apache.beam.runners.prism - package org.apache.beam.runners.prism
-
Support for executing a pipeline on Prism.
- org.apache.beam.runners.spark - package org.apache.beam.runners.spark
-
Internal implementation of the Beam runner for Apache Spark.
- org.apache.beam.runners.spark.coders - package org.apache.beam.runners.spark.coders
-
Beam coders and coder-related utilities for running on Apache Spark.
- org.apache.beam.runners.spark.io - package org.apache.beam.runners.spark.io
-
Spark-specific transforms for I/O.
- org.apache.beam.runners.spark.metrics - package org.apache.beam.runners.spark.metrics
-
Provides internal utilities for implementing Beam metrics using Spark accumulators.
- org.apache.beam.runners.spark.metrics.sink - package org.apache.beam.runners.spark.metrics.sink
-
Spark sinks that support Beam metrics and aggregators.
- org.apache.beam.runners.spark.stateful - package org.apache.beam.runners.spark.stateful
-
Spark-specific stateful operators.
- org.apache.beam.runners.spark.structuredstreaming - package org.apache.beam.runners.spark.structuredstreaming
-
Internal implementation of the Beam runner for Apache Spark.
- org.apache.beam.runners.spark.structuredstreaming.examples - package org.apache.beam.runners.spark.structuredstreaming.examples
- org.apache.beam.runners.spark.structuredstreaming.io - package org.apache.beam.runners.spark.structuredstreaming.io
-
Spark-specific transforms for I/O.
- org.apache.beam.runners.spark.structuredstreaming.metrics - package org.apache.beam.runners.spark.structuredstreaming.metrics
-
Provides internal utilities for implementing Beam metrics using Spark accumulators.
- org.apache.beam.runners.spark.structuredstreaming.metrics.sink - package org.apache.beam.runners.spark.structuredstreaming.metrics.sink
-
Spark sinks that support Beam metrics and aggregators.
- org.apache.beam.runners.spark.structuredstreaming.translation - package org.apache.beam.runners.spark.structuredstreaming.translation
-
Internal translators for running Beam pipelines on Spark.
- org.apache.beam.runners.spark.structuredstreaming.translation.batch - package org.apache.beam.runners.spark.structuredstreaming.translation.batch
-
Internal utilities to translate Beam pipelines to Spark batching.
- org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions - package org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
-
Internal implementation of the Beam runner for Apache Spark.
- org.apache.beam.runners.spark.structuredstreaming.translation.helpers - package org.apache.beam.runners.spark.structuredstreaming.translation.helpers
-
Internal helpers to translate Beam pipelines to Spark streaming.
- org.apache.beam.runners.spark.structuredstreaming.translation.utils - package org.apache.beam.runners.spark.structuredstreaming.translation.utils
-
Internal utils to translate Beam pipelines to Spark streaming.
- org.apache.beam.runners.spark.translation - package org.apache.beam.runners.spark.translation
-
Internal translators for running Beam pipelines on Spark.
- org.apache.beam.runners.spark.translation.streaming - package org.apache.beam.runners.spark.translation.streaming
-
Internal utilities to translate Beam pipelines to Spark streaming.
- org.apache.beam.runners.spark.util - package org.apache.beam.runners.spark.util
-
Internal utilities to translate Beam pipelines to Spark.
- org.apache.beam.runners.twister2 - package org.apache.beam.runners.twister2
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.runners.twister2.translation.wrappers - package org.apache.beam.runners.twister2.translation.wrappers
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.runners.twister2.translators - package org.apache.beam.runners.twister2.translators
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.runners.twister2.translators.batch - package org.apache.beam.runners.twister2.translators.batch
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.runners.twister2.translators.functions - package org.apache.beam.runners.twister2.translators.functions
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.runners.twister2.translators.functions.internal - package org.apache.beam.runners.twister2.translators.functions.internal
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.runners.twister2.translators.streaming - package org.apache.beam.runners.twister2.translators.streaming
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.runners.twister2.utils - package org.apache.beam.runners.twister2.utils
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.sdk - package org.apache.beam.sdk
-
Provides a simple, powerful model for building both batch and streaming parallel data processing Pipelines.
- org.apache.beam.sdk.annotations - package org.apache.beam.sdk.annotations
-
Defines annotations used across the SDK.
- org.apache.beam.sdk.coders - package org.apache.beam.sdk.coders
-
Defines Coders to specify how data is encoded to and decoded from byte strings.
- org.apache.beam.sdk.expansion - package org.apache.beam.sdk.expansion
-
Contains classes needed to expose transforms to other SDKs.
- org.apache.beam.sdk.expansion.service - package org.apache.beam.sdk.expansion.service
-
Classes used to expand cross-language transforms.
- org.apache.beam.sdk.extensions.arrow - package org.apache.beam.sdk.extensions.arrow
-
Extensions for using Apache Arrow with Beam.
- org.apache.beam.sdk.extensions.avro - package org.apache.beam.sdk.extensions.avro
- org.apache.beam.sdk.extensions.avro.coders - package org.apache.beam.sdk.extensions.avro.coders
-
Defines Coders to specify how data is encoded to and decoded from byte strings using Apache Avro.
- org.apache.beam.sdk.extensions.avro.io - package org.apache.beam.sdk.extensions.avro.io
-
Defines transforms for reading and writing Avro storage format.
- org.apache.beam.sdk.extensions.avro.schemas - package org.apache.beam.sdk.extensions.avro.schemas
- org.apache.beam.sdk.extensions.avro.schemas.io.payloads - package org.apache.beam.sdk.extensions.avro.schemas.io.payloads
-
Provides abstractions for schema-aware AvroIO.
- org.apache.beam.sdk.extensions.avro.schemas.utils - package org.apache.beam.sdk.extensions.avro.schemas.utils
-
Defines utilities for dealing with schemas using Apache Avro.
- org.apache.beam.sdk.extensions.gcp.auth - package org.apache.beam.sdk.extensions.gcp.auth
-
Defines classes related to interacting with Credentials for pipeline creation and execution containing Google Cloud Platform components.
- org.apache.beam.sdk.extensions.gcp.options - package org.apache.beam.sdk.extensions.gcp.options
-
Defines PipelineOptions for configuring pipeline execution for Google Cloud Platform components.
- org.apache.beam.sdk.extensions.gcp.storage - package org.apache.beam.sdk.extensions.gcp.storage
-
Defines IO connectors for Google Cloud Storage.
- org.apache.beam.sdk.extensions.gcp.util - package org.apache.beam.sdk.extensions.gcp.util
-
Defines Google Cloud Platform component utilities that can be used by Beam runners.
- org.apache.beam.sdk.extensions.gcp.util.channels - package org.apache.beam.sdk.extensions.gcp.util.channels
-
Package contains Java channel wrappers used with GCS.
- org.apache.beam.sdk.extensions.gcp.util.gcsfs - package org.apache.beam.sdk.extensions.gcp.util.gcsfs
-
Defines utilities used to interact with Google Cloud Storage.
- org.apache.beam.sdk.extensions.jackson - package org.apache.beam.sdk.extensions.jackson
-
Utilities for parsing and creating JSON serialized objects.
- org.apache.beam.sdk.extensions.joinlibrary - package org.apache.beam.sdk.extensions.joinlibrary
-
Utilities for performing SQL-style joins of keyed PCollections.
- org.apache.beam.sdk.extensions.ml - package org.apache.beam.sdk.extensions.ml
-
Provides DoFns for integration with Google Cloud AI Video Intelligence service.
- org.apache.beam.sdk.extensions.ordered - package org.apache.beam.sdk.extensions.ordered
-
Provides a transform for ordered processing.
- org.apache.beam.sdk.extensions.ordered.combiner - package org.apache.beam.sdk.extensions.ordered.combiner
-
Default implementation of the global sequence combiner used by OrderedEventProcessor when processing events using global sequences.
- org.apache.beam.sdk.extensions.protobuf - package org.apache.beam.sdk.extensions.protobuf
-
Defines a Coder for Protocol Buffers messages, ProtoCoder.
- org.apache.beam.sdk.extensions.python - package org.apache.beam.sdk.extensions.python
-
Extensions for invoking Python transforms from the Beam Java SDK.
- org.apache.beam.sdk.extensions.python.transforms - package org.apache.beam.sdk.extensions.python.transforms
-
Extensions for invoking Python transforms from the Beam Java SDK.
- org.apache.beam.sdk.extensions.sbe - package org.apache.beam.sdk.extensions.sbe
-
Extension for working with SBE messages in Beam.
- org.apache.beam.sdk.extensions.schemaio.expansion - package org.apache.beam.sdk.extensions.schemaio.expansion
-
External Transform Registration for SchemaIOs.
- org.apache.beam.sdk.extensions.sketching - package org.apache.beam.sdk.extensions.sketching
-
Utilities for computing statistical indicators using probabilistic sketches.
- org.apache.beam.sdk.extensions.sorter - package org.apache.beam.sdk.extensions.sorter
-
Utility for performing local sort of potentially large sets of values.
- org.apache.beam.sdk.extensions.sql - package org.apache.beam.sdk.extensions.sql
-
BeamSQL provides a new interface to run a SQL statement with Beam.
- org.apache.beam.sdk.extensions.sql.example - package org.apache.beam.sdk.extensions.sql.example
-
Example of how to use the Data Catalog table provider.
- org.apache.beam.sdk.extensions.sql.example.model - package org.apache.beam.sdk.extensions.sql.example.model
-
Java classes used for modeling the examples.
- org.apache.beam.sdk.extensions.sql.expansion - package org.apache.beam.sdk.extensions.sql.expansion
-
External Transform Registration for Beam SQL.
- org.apache.beam.sdk.extensions.sql.impl - package org.apache.beam.sdk.extensions.sql.impl
-
Implementation classes of BeamSql.
- org.apache.beam.sdk.extensions.sql.impl.cep - package org.apache.beam.sdk.extensions.sql.impl.cep
-
Utilities for Complex Event Processing (CEP).
- org.apache.beam.sdk.extensions.sql.impl.nfa - package org.apache.beam.sdk.extensions.sql.impl.nfa
-
Package of Non-deterministic Finite Automata (NFA) for MATCH_RECOGNIZE.
- org.apache.beam.sdk.extensions.sql.impl.parser - package org.apache.beam.sdk.extensions.sql.impl.parser
-
Beam SQL parsing additions to Calcite SQL.
- org.apache.beam.sdk.extensions.sql.impl.planner - package org.apache.beam.sdk.extensions.sql.impl.planner
-
BeamQueryPlanner is the main interface.
- org.apache.beam.sdk.extensions.sql.impl.rel - package org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamSQL-specific nodes, to replace RelNode.
- org.apache.beam.sdk.extensions.sql.impl.rule - package org.apache.beam.sdk.extensions.sql.impl.rule
-
RelOptRule to generate BeamRelNode.
- org.apache.beam.sdk.extensions.sql.impl.schema - package org.apache.beam.sdk.extensions.sql.impl.schema
-
Defines table schemas to map to Beam IO components.
- org.apache.beam.sdk.extensions.sql.impl.transform - package org.apache.beam.sdk.extensions.sql.impl.transform
-
PTransform used in a BeamSql pipeline.
- org.apache.beam.sdk.extensions.sql.impl.transform.agg - package org.apache.beam.sdk.extensions.sql.impl.transform.agg
-
Implementation of standard SQL aggregation functions, e.g.
- org.apache.beam.sdk.extensions.sql.impl.udaf - package org.apache.beam.sdk.extensions.sql.impl.udaf
-
UDAF classes.
- org.apache.beam.sdk.extensions.sql.impl.udf - package org.apache.beam.sdk.extensions.sql.impl.udf
-
UDF classes.
- org.apache.beam.sdk.extensions.sql.impl.utils - package org.apache.beam.sdk.extensions.sql.impl.utils
-
Utility classes.
- org.apache.beam.sdk.extensions.sql.meta - package org.apache.beam.sdk.extensions.sql.meta
-
Metadata related classes.
- org.apache.beam.sdk.extensions.sql.meta.catalog - package org.apache.beam.sdk.extensions.sql.meta.catalog
-
Catalogs.
- org.apache.beam.sdk.extensions.sql.meta.provider - package org.apache.beam.sdk.extensions.sql.meta.provider
-
Table providers.
- org.apache.beam.sdk.extensions.sql.meta.provider.avro - package org.apache.beam.sdk.extensions.sql.meta.provider.avro
-
Table schema for AvroIO.
- org.apache.beam.sdk.extensions.sql.meta.provider.bigquery - package org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
-
Table schema for BigQuery.
- org.apache.beam.sdk.extensions.sql.meta.provider.bigtable - package org.apache.beam.sdk.extensions.sql.meta.provider.bigtable
-
Table schema for BigTable.
- org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog - package org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
-
Table schema for Google Cloud Data Catalog.
- org.apache.beam.sdk.extensions.sql.meta.provider.datagen - package org.apache.beam.sdk.extensions.sql.meta.provider.datagen
-
Table schema for Datagen.
- org.apache.beam.sdk.extensions.sql.meta.provider.datastore - package org.apache.beam.sdk.extensions.sql.meta.provider.datastore
-
Table schema for DataStore.
- org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog - package org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog
-
Table schema for HCatalog.
- org.apache.beam.sdk.extensions.sql.meta.provider.iceberg - package org.apache.beam.sdk.extensions.sql.meta.provider.iceberg
-
Table schema for Iceberg.
- org.apache.beam.sdk.extensions.sql.meta.provider.kafka - package org.apache.beam.sdk.extensions.sql.meta.provider.kafka
-
Table schema for KafkaIO.
- org.apache.beam.sdk.extensions.sql.meta.provider.mongodb - package org.apache.beam.sdk.extensions.sql.meta.provider.mongodb
-
Table schema for MongoDb.
- org.apache.beam.sdk.extensions.sql.meta.provider.parquet - package org.apache.beam.sdk.extensions.sql.meta.provider.parquet
-
Table schema for ParquetIO.
- org.apache.beam.sdk.extensions.sql.meta.provider.pubsub - package org.apache.beam.sdk.extensions.sql.meta.provider.pubsub
-
Table schema for PubsubIO.
- org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite - package org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite
-
Provides abstractions for schema-aware IOs.
- org.apache.beam.sdk.extensions.sql.meta.provider.seqgen - package org.apache.beam.sdk.extensions.sql.meta.provider.seqgen
-
Table schema for streaming sequence generator.
- org.apache.beam.sdk.extensions.sql.meta.provider.test - package org.apache.beam.sdk.extensions.sql.meta.provider.test
-
Table schema for in-memory test data.
- org.apache.beam.sdk.extensions.sql.meta.provider.text - package org.apache.beam.sdk.extensions.sql.meta.provider.text
-
Table schema for text files.
- org.apache.beam.sdk.extensions.sql.meta.store - package org.apache.beam.sdk.extensions.sql.meta.store
-
Meta stores.
- org.apache.beam.sdk.extensions.sql.provider - package org.apache.beam.sdk.extensions.sql.provider
-
Package containing UDF providers for testing.
- org.apache.beam.sdk.extensions.sql.udf - package org.apache.beam.sdk.extensions.sql.udf
-
Provides interfaces for defining user-defined functions in Beam SQL.
- org.apache.beam.sdk.extensions.sql.zetasql - package org.apache.beam.sdk.extensions.sql.zetasql
-
ZetaSQL Dialect package.
- org.apache.beam.sdk.extensions.sql.zetasql.translation - package org.apache.beam.sdk.extensions.sql.zetasql.translation
-
Conversion logic between ZetaSQL resolved query nodes and Calcite rel nodes.
- org.apache.beam.sdk.extensions.sql.zetasql.translation.impl - package org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
-
Java implementation of ZetaSQL functions.
- org.apache.beam.sdk.extensions.sql.zetasql.unnest - package org.apache.beam.sdk.extensions.sql.zetasql.unnest
-
Temporary solution to support ZetaSQL UNNEST.
- org.apache.beam.sdk.extensions.timeseries - package org.apache.beam.sdk.extensions.timeseries
-
Utilities for operating on timeseries data.
- org.apache.beam.sdk.extensions.yaml - package org.apache.beam.sdk.extensions.yaml
-
Extensions for invoking Beam YAML transforms from the Beam Java SDK.
- org.apache.beam.sdk.extensions.zetasketch - package org.apache.beam.sdk.extensions.zetasketch
-
PTransforms to compute statistical sketches on data streams based on the ZetaSketch implementation.
- org.apache.beam.sdk.fn - package org.apache.beam.sdk.fn
-
The top level package for the Fn Execution Java libraries.
- org.apache.beam.sdk.fn.channel - package org.apache.beam.sdk.fn.channel
-
gRPC channel management.
- org.apache.beam.sdk.fn.data - package org.apache.beam.sdk.fn.data
-
Classes to interact with the portability framework data plane.
- org.apache.beam.sdk.fn.server - package org.apache.beam.sdk.fn.server
-
gRPC server factory.
- org.apache.beam.sdk.fn.splittabledofn - package org.apache.beam.sdk.fn.splittabledofn
-
Defines utilities related to executing splittable DoFn.
- org.apache.beam.sdk.fn.stream - package org.apache.beam.sdk.fn.stream
-
gRPC stream management.
- org.apache.beam.sdk.fn.test - package org.apache.beam.sdk.fn.test
-
Utilities for testing use of this package.
- org.apache.beam.sdk.fn.windowing - package org.apache.beam.sdk.fn.windowing
-
Common utilities related to windowing during execution of a pipeline.
- org.apache.beam.sdk.function - package org.apache.beam.sdk.function
-
Java 8 functional interface extensions.
- org.apache.beam.sdk.harness - package org.apache.beam.sdk.harness
-
Utilities for configuring worker environment.
- org.apache.beam.sdk.io - package org.apache.beam.sdk.io
-
Defines transforms for reading and writing common storage formats, including org.apache.beam.sdk.io.AvroIO and TextIO.
- org.apache.beam.sdk.io.amqp - package org.apache.beam.sdk.io.amqp
-
Transforms for reading and writing using AMQP 1.0 protocol.
- org.apache.beam.sdk.io.aws2.auth - package org.apache.beam.sdk.io.aws2.auth
-
Common code for AWS authentication related functionalities.
- org.apache.beam.sdk.io.aws2.common - package org.apache.beam.sdk.io.aws2.common
-
Common code for AWS sources and sinks such as retry configuration.
- org.apache.beam.sdk.io.aws2.dynamodb - package org.apache.beam.sdk.io.aws2.dynamodb
-
Defines IO connectors for Amazon Web Services DynamoDB.
- org.apache.beam.sdk.io.aws2.kinesis - package org.apache.beam.sdk.io.aws2.kinesis
-
Transforms for reading from Amazon Kinesis.
- org.apache.beam.sdk.io.aws2.options - package org.apache.beam.sdk.io.aws2.options
-
Defines PipelineOptions for configuring pipeline execution for Amazon Web Services components.
- org.apache.beam.sdk.io.aws2.s3 - package org.apache.beam.sdk.io.aws2.s3
-
Defines IO connectors for Amazon Web Services S3.
- org.apache.beam.sdk.io.aws2.schemas - package org.apache.beam.sdk.io.aws2.schemas
-
Schemas for AWS model classes.
- org.apache.beam.sdk.io.aws2.sns - package org.apache.beam.sdk.io.aws2.sns
-
Defines IO connectors for Amazon Web Services SNS.
- org.apache.beam.sdk.io.aws2.sqs - package org.apache.beam.sdk.io.aws2.sqs
-
Defines IO connectors for Amazon Web Services SQS.
- org.apache.beam.sdk.io.aws2.sqs.providers - package org.apache.beam.sdk.io.aws2.sqs.providers
-
Defines external schema transformation providers for Amazon Web Services SQS.
- org.apache.beam.sdk.io.azure.blobstore - package org.apache.beam.sdk.io.azure.blobstore
-
Defines IO connectors for Azure Blob Storage.
- org.apache.beam.sdk.io.azure.cosmos - package org.apache.beam.sdk.io.azure.cosmos
-
Defines IO connectors for Azure Cosmos DB.
- org.apache.beam.sdk.io.azure.options - package org.apache.beam.sdk.io.azure.options
-
Defines IO connectors for Microsoft Azure Blobstore.
- org.apache.beam.sdk.io.cassandra - package org.apache.beam.sdk.io.cassandra
-
Transforms for reading and writing from/to Apache Cassandra.
- org.apache.beam.sdk.io.cdap - package org.apache.beam.sdk.io.cdap
-
Transforms for reading and writing from CDAP.
- org.apache.beam.sdk.io.cdap.context - package org.apache.beam.sdk.io.cdap.context
-
Context for CDAP classes.
- org.apache.beam.sdk.io.clickhouse - package org.apache.beam.sdk.io.clickhouse
-
Transform for writing to ClickHouse.
- org.apache.beam.sdk.io.contextualtextio - package org.apache.beam.sdk.io.contextualtextio
-
Transforms for reading from files with contextual information.
- org.apache.beam.sdk.io.csv - package org.apache.beam.sdk.io.csv
-
Transforms for reading and writing CSV files.
- org.apache.beam.sdk.io.csv.providers - package org.apache.beam.sdk.io.csv.providers
-
Transforms for reading and writing CSV files.
- org.apache.beam.sdk.io.elasticsearch - package org.apache.beam.sdk.io.elasticsearch
-
Common test utilities for Elasticsearch.
- org.apache.beam.sdk.io.fileschematransform - package org.apache.beam.sdk.io.fileschematransform
-
Defines transforms for File reading and writing support with Schema Transform.
- org.apache.beam.sdk.io.fs - package org.apache.beam.sdk.io.fs
-
Apache Beam FileSystem interfaces and their default implementations.
- org.apache.beam.sdk.io.gcp.bigquery - package org.apache.beam.sdk.io.gcp.bigquery
-
Defines transforms for reading and writing from Google BigQuery.
- org.apache.beam.sdk.io.gcp.bigquery.providers - package org.apache.beam.sdk.io.gcp.bigquery.providers
-
Defines SchemaTransformProviders for reading and writing from Google BigQuery.
- org.apache.beam.sdk.io.gcp.bigtable - package org.apache.beam.sdk.io.gcp.bigtable
-
Defines transforms for reading and writing from Google Cloud Bigtable.
- org.apache.beam.sdk.io.gcp.bigtable.changestreams - package org.apache.beam.sdk.io.gcp.bigtable.changestreams
-
Change stream for Google Cloud Bigtable.
- org.apache.beam.sdk.io.gcp.bigtable.changestreams.action - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
-
Business logic to process change stream for Google Cloud Bigtable.
- org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
-
Data access object for change stream for Google Cloud Bigtable.
- org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
-
DoFn and SDF definitions to process Google Cloud Bigtable Change Streams.
- org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder
-
Encoders for writing and reading from Metadata Table for Google Cloud Bigtable Change Streams.
- org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
-
Classes related to estimating the throughput of the change streams SDFs.
- org.apache.beam.sdk.io.gcp.bigtable.changestreams.model - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
-
User models for the Google Cloud Bigtable change stream API.
- org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler
-
Partition reconciler for Google Cloud Bigtable Change Streams.
- org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction
-
Custom RestrictionTracker for Google Cloud Bigtable Change Streams.
- org.apache.beam.sdk.io.gcp.common - package org.apache.beam.sdk.io.gcp.common
-
Defines common Google Cloud Platform IO support classes.
- org.apache.beam.sdk.io.gcp.datastore - package org.apache.beam.sdk.io.gcp.datastore
-
Provides an API for reading from and writing to Google Cloud Datastore over different versions of the Cloud Datastore Client libraries.
- org.apache.beam.sdk.io.gcp.firestore - package org.apache.beam.sdk.io.gcp.firestore
-
Provides an API for reading from and writing to Google Cloud Firestore.
- org.apache.beam.sdk.io.gcp.healthcare - package org.apache.beam.sdk.io.gcp.healthcare
-
Provides an API for reading from and writing to the Google Cloud Healthcare API.
- org.apache.beam.sdk.io.gcp.pubsub - package org.apache.beam.sdk.io.gcp.pubsub
-
Defines transforms for reading and writing from Google Cloud Pub/Sub.
- org.apache.beam.sdk.io.gcp.pubsublite - package org.apache.beam.sdk.io.gcp.pubsublite
-
Defines transforms for reading and writing from Google Cloud Pub/Sub Lite.
- org.apache.beam.sdk.io.gcp.pubsublite.internal - package org.apache.beam.sdk.io.gcp.pubsublite.internal
-
Defines transforms for reading and writing from Google Cloud Pub/Sub Lite.
- org.apache.beam.sdk.io.gcp.spanner - package org.apache.beam.sdk.io.gcp.spanner
-
Provides an API for reading from and writing to Google Cloud Spanner.
- org.apache.beam.sdk.io.gcp.spanner.changestreams - package org.apache.beam.sdk.io.gcp.spanner.changestreams
-
Provides an API for reading change stream data from Google Cloud Spanner.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.action - package org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
Action processors for each of the types of Change Stream records received.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.cache - package org.apache.beam.sdk.io.gcp.spanner.changestreams.cache
-
Caching strategy for watermark.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.dao - package org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Database Access Objects for querying change streams and modifying the Connector's metadata tables.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn - package org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
DoFn and SDF definitions to process Google Cloud Spanner Change Streams.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder - package org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder
-
User model for the Spanner change stream API.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator - package org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
-
Classes related to estimating the throughput of the change streams SDFs.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper - package org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
-
Mapping-related functionality, such as from ResultSets to Change Stream models.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.model - package org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
User models for the Spanner change stream API.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction - package org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
Custom restriction tracker related classes.
- org.apache.beam.sdk.io.gcp.testing - package org.apache.beam.sdk.io.gcp.testing
-
Defines utilities for unit testing Google Cloud Platform components of Apache Beam pipelines.
- org.apache.beam.sdk.io.googleads - package org.apache.beam.sdk.io.googleads
-
Defines transforms for reading from Google Ads.
- org.apache.beam.sdk.io.hadoop - package org.apache.beam.sdk.io.hadoop
-
Classes shared by Hadoop based IOs.
- org.apache.beam.sdk.io.hadoop.format - package org.apache.beam.sdk.io.hadoop.format
-
Defines transforms for writing to data sinks that implement HadoopFormatIO.
- org.apache.beam.sdk.io.hbase - package org.apache.beam.sdk.io.hbase
-
Transforms for reading and writing from/to Apache HBase.
- org.apache.beam.sdk.io.hcatalog - package org.apache.beam.sdk.io.hcatalog
-
Transforms for reading and writing using HCatalog.
- org.apache.beam.sdk.io.hdfs - package org.apache.beam.sdk.io.hdfs
-
FileSystem implementation for any Hadoop FileSystem.
- org.apache.beam.sdk.io.iceberg - package org.apache.beam.sdk.io.iceberg
-
Iceberg connectors.
- org.apache.beam.sdk.io.influxdb - package org.apache.beam.sdk.io.influxdb
-
Transforms for reading and writing from/to InfluxDB.
- org.apache.beam.sdk.io.jdbc - package org.apache.beam.sdk.io.jdbc
-
Transforms for reading and writing from JDBC.
- org.apache.beam.sdk.io.jdbc.providers - package org.apache.beam.sdk.io.jdbc.providers
-
Transforms for reading and writing from JDBC.
- org.apache.beam.sdk.io.jms - package org.apache.beam.sdk.io.jms
-
Transforms for reading and writing from JMS (Java Messaging Service).
- org.apache.beam.sdk.io.json - package org.apache.beam.sdk.io.json
-
Transforms for reading and writing JSON files.
- org.apache.beam.sdk.io.json.providers - package org.apache.beam.sdk.io.json.providers
-
Transforms for reading and writing JSON files.
- org.apache.beam.sdk.io.kafka - package org.apache.beam.sdk.io.kafka
-
Transforms for reading and writing from Apache Kafka.
- org.apache.beam.sdk.io.kafka.jmh - package org.apache.beam.sdk.io.kafka.jmh
-
Benchmarks for KafkaIO.
- org.apache.beam.sdk.io.kafka.serialization - package org.apache.beam.sdk.io.kafka.serialization
-
Kafka serializers and deserializers.
- org.apache.beam.sdk.io.kafka.upgrade - package org.apache.beam.sdk.io.kafka.upgrade
-
A library to support upgrading Kafka transforms without upgrading the pipeline.
- org.apache.beam.sdk.io.kudu - package org.apache.beam.sdk.io.kudu
-
Transforms for reading and writing from/to Apache Kudu.
- org.apache.beam.sdk.io.mongodb - package org.apache.beam.sdk.io.mongodb
-
Transforms for reading and writing from MongoDB.
- org.apache.beam.sdk.io.mqtt - package org.apache.beam.sdk.io.mqtt
-
Transforms for reading and writing from MQTT.
- org.apache.beam.sdk.io.neo4j - package org.apache.beam.sdk.io.neo4j
-
Transforms for reading from and writing to Neo4j.
- org.apache.beam.sdk.io.parquet - package org.apache.beam.sdk.io.parquet
-
Transforms for reading and writing from Parquet.
- org.apache.beam.sdk.io.pulsar - package org.apache.beam.sdk.io.pulsar
-
Transforms for reading and writing from Apache Pulsar.
- org.apache.beam.sdk.io.rabbitmq - package org.apache.beam.sdk.io.rabbitmq
-
Transforms for reading and writing from RabbitMQ.
- org.apache.beam.sdk.io.range - package org.apache.beam.sdk.io.range
-
Provides thread-safe helpers for implementing dynamic work rebalancing in position-based bounded sources.
- org.apache.beam.sdk.io.redis - package org.apache.beam.sdk.io.redis
-
Transforms for reading and writing from Redis.
- org.apache.beam.sdk.io.singlestore - package org.apache.beam.sdk.io.singlestore
-
Transforms for reading and writing from SingleStoreDB.
- org.apache.beam.sdk.io.singlestore.schematransform - package org.apache.beam.sdk.io.singlestore.schematransform
-
SingleStoreIO SchemaTransforms.
- org.apache.beam.sdk.io.snowflake - package org.apache.beam.sdk.io.snowflake
-
Snowflake IO transforms.
- org.apache.beam.sdk.io.snowflake.crosslanguage - package org.apache.beam.sdk.io.snowflake.crosslanguage
-
Cross-language for SnowflakeIO.
- org.apache.beam.sdk.io.snowflake.data - package org.apache.beam.sdk.io.snowflake.data
-
Snowflake IO data types.
- org.apache.beam.sdk.io.snowflake.data.datetime - package org.apache.beam.sdk.io.snowflake.data.datetime
-
Snowflake IO date/time types.
- org.apache.beam.sdk.io.snowflake.data.geospatial - package org.apache.beam.sdk.io.snowflake.data.geospatial
-
Snowflake IO geospatial types.
- org.apache.beam.sdk.io.snowflake.data.logical - package org.apache.beam.sdk.io.snowflake.data.logical
-
Snowflake IO logical types.
- org.apache.beam.sdk.io.snowflake.data.numeric - package org.apache.beam.sdk.io.snowflake.data.numeric
-
Snowflake IO numeric types.
- org.apache.beam.sdk.io.snowflake.data.structured - package org.apache.beam.sdk.io.snowflake.data.structured
-
Snowflake IO structured types.
- org.apache.beam.sdk.io.snowflake.data.text - package org.apache.beam.sdk.io.snowflake.data.text
-
Snowflake IO text types.
- org.apache.beam.sdk.io.snowflake.enums - package org.apache.beam.sdk.io.snowflake.enums
-
Snowflake IO data types.
- org.apache.beam.sdk.io.snowflake.services - package org.apache.beam.sdk.io.snowflake.services
-
Snowflake IO services and POJOs.
- org.apache.beam.sdk.io.solace - package org.apache.beam.sdk.io.solace
-
Solace IO connector.
- org.apache.beam.sdk.io.solace.broker - package org.apache.beam.sdk.io.solace.broker
-
Solace IO broker-related classes.
- org.apache.beam.sdk.io.solace.data - package org.apache.beam.sdk.io.solace.data
-
Solace IO connector - data-related classes.
- org.apache.beam.sdk.io.solace.read - package org.apache.beam.sdk.io.solace.read
-
Solace IO connector - read connector classes.
- org.apache.beam.sdk.io.solace.write - package org.apache.beam.sdk.io.solace.write
-
SolaceIO Write connector.
- org.apache.beam.sdk.io.solr - package org.apache.beam.sdk.io.solr
-
Transforms for reading and writing from/to Solr.
- org.apache.beam.sdk.io.sparkreceiver - package org.apache.beam.sdk.io.sparkreceiver
-
Transforms for reading and writing from streaming CDAP plugins.
- org.apache.beam.sdk.io.splunk - package org.apache.beam.sdk.io.splunk
-
Transforms for writing events to Splunk's HTTP Event Collector (HEC).
- org.apache.beam.sdk.io.thrift - package org.apache.beam.sdk.io.thrift
-
Transforms for reading and writing to Thrift files.
- org.apache.beam.sdk.io.tika - package org.apache.beam.sdk.io.tika
-
Transform for reading and parsing files with Apache Tika.
- org.apache.beam.sdk.io.xml - package org.apache.beam.sdk.io.xml
-
Transforms for reading and writing Xml files.
- org.apache.beam.sdk.jmh.io - package org.apache.beam.sdk.jmh.io
-
Benchmarks for IO.
- org.apache.beam.sdk.jmh.schemas - package org.apache.beam.sdk.jmh.schemas
-
Benchmarks for schemas.
- org.apache.beam.sdk.jmh.util - package org.apache.beam.sdk.jmh.util
-
Benchmarks for core SDK utility classes.
- org.apache.beam.sdk.managed - package org.apache.beam.sdk.managed
-
Managed reads and writes.
- org.apache.beam.sdk.managed.testing - package org.apache.beam.sdk.managed.testing
-
Test transform for Managed API.
- org.apache.beam.sdk.metrics - package org.apache.beam.sdk.metrics
-
Metrics allow exporting information about the execution of a pipeline.
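For example, a minimal sketch of a user-defined counter in a DoFn (the names are illustrative):

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    class CountingFn extends DoFn<String, String> {
      private final Counter seen = Metrics.counter(CountingFn.class, "elements-seen");

      @ProcessElement
      public void process(@Element String element, OutputReceiver<String> out) {
        seen.inc(); // exported by the runner as pipeline metrics
        out.output(element);
      }
    }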
- org.apache.beam.sdk.options - package org.apache.beam.sdk.options
-
Defines PipelineOptions for configuring pipeline execution.
- org.apache.beam.sdk.providers - package org.apache.beam.sdk.providers
-
Defines SchemaTransformProviders for transforms in the core module.
- org.apache.beam.sdk.schemas - package org.apache.beam.sdk.schemas
- org.apache.beam.sdk.schemas.annotations - package org.apache.beam.sdk.schemas.annotations
- org.apache.beam.sdk.schemas.io - package org.apache.beam.sdk.schemas.io
-
Provides abstractions for schema-aware IOs.
- org.apache.beam.sdk.schemas.io.payloads - package org.apache.beam.sdk.schemas.io.payloads
-
Provides abstractions for schema-aware IOs.
- org.apache.beam.sdk.schemas.logicaltypes - package org.apache.beam.sdk.schemas.logicaltypes
-
A set of common LogicalTypes for use with schemas.
- org.apache.beam.sdk.schemas.parser - package org.apache.beam.sdk.schemas.parser
-
Defines utilities for dealing with schemas.
- org.apache.beam.sdk.schemas.parser.generated - package org.apache.beam.sdk.schemas.parser.generated
-
Defines utilities for dealing with schemas.
- org.apache.beam.sdk.schemas.transforms - package org.apache.beam.sdk.schemas.transforms
-
Defines transforms that work on PCollections with schemas.
- org.apache.beam.sdk.schemas.transforms.providers - package org.apache.beam.sdk.schemas.transforms.providers
-
Defines transforms that work on PCollections with schemas.
- org.apache.beam.sdk.schemas.utils - package org.apache.beam.sdk.schemas.utils
-
Defines utilities for dealing with schemas.
- org.apache.beam.sdk.state - package org.apache.beam.sdk.state
-
Classes and interfaces for interacting with state.
- org.apache.beam.sdk.testing - package org.apache.beam.sdk.testing
-
Defines utilities for unit testing Apache Beam pipelines.
- org.apache.beam.sdk.transforms - package org.apache.beam.sdk.transforms
-
Defines PTransforms for transforming data in a pipeline.
- org.apache.beam.sdk.transforms.display - package org.apache.beam.sdk.transforms.display
-
Defines HasDisplayData for annotating components which provide display data used within UIs and diagnostic tools.
- org.apache.beam.sdk.transforms.errorhandling - package org.apache.beam.sdk.transforms.errorhandling
-
Provides utilities for handling errors in Pipelines.
- org.apache.beam.sdk.transforms.join - package org.apache.beam.sdk.transforms.join
-
Defines the CoGroupByKey transform for joining multiple PCollections.
- org.apache.beam.sdk.transforms.resourcehints - package org.apache.beam.sdk.transforms.resourcehints
-
Defines ResourceHints for configuring pipeline execution.
- org.apache.beam.sdk.transforms.splittabledofn - package org.apache.beam.sdk.transforms.splittabledofn
-
Defines utilities related to splittable DoFn.
- org.apache.beam.sdk.transforms.windowing - package org.apache.beam.sdk.transforms.windowing
- org.apache.beam.sdk.transformservice.launcher - package org.apache.beam.sdk.transformservice.launcher
-
A library that can be used to start up a Docker-composed based Beam transform service.
- org.apache.beam.sdk.values - package org.apache.beam.sdk.values
-
Defines PCollection and other classes for representing data in a Pipeline.
- ORPHANED_NEW_PARTITION_CLEANED_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Counter for the total number of orphaned new partitions cleaned up.
- OrphanedMetadataCleaner - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler
- OrphanedMetadataCleaner() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
- out() - Static method in class org.apache.beam.runners.spark.io.ConsoleIO.Write
-
Prints 10 elements from the PCollection to the console.
- out(int) - Static method in class org.apache.beam.runners.spark.io.ConsoleIO.Write
-
Prints num elements from the PCollection to stdout.
- OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
-
The tag for the main output of FHIR resources.
- OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
-
The tag for the main output of FHIR Resources from a search.
- OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
-
The tag for the main output of FHIR Resources from a GetPatientEverything request.
- OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read
-
The tag for the main output of HL7v2 read responses.
- OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
-
The tag for the main output of HL7v2 Messages.
- outbound(DataStreams.OutputChunkConsumer<ByteString>) - Static method in class org.apache.beam.sdk.fn.stream.DataStreams
-
Converts a single element delimited OutputStream into multiple ByteStrings.
- outbound(DataStreams.OutputChunkConsumer<ByteString>, int) - Static method in class org.apache.beam.sdk.fn.stream.DataStreams
-
Converts a single element delimited OutputStream into multiple ByteStrings using the specified maximum chunk size.
- OutboundObserverFactory - Class in org.apache.beam.sdk.fn.stream
-
Creates factories which determine an underlying StreamObserver implementation to use to interact with fn execution APIs.
- OutboundObserverFactory() - Constructor for class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
- OutboundObserverFactory.BasicFactory<ReqT, RespT> - Interface in org.apache.beam.sdk.fn.stream
-
Creates an outbound observer for the given inbound observer.
- outboundObserverFor(OutboundObserverFactory.BasicFactory<ReqT, RespT>, StreamObserver<ReqT>) - Method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
-
Creates an outbound observer for the given inbound observer by potentially inserting hooks into the inbound and outbound observers.
- outboundObserverFor(StreamObserver<ReqT>) - Method in interface org.apache.beam.sdk.fn.stream.OutboundObserverFactory.BasicFactory
- OUTER - Static variable in class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated. The outer context: the value being encoded or decoded takes up the remainder of the record/stream contents.
- OutgoingMessage() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
- output - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.BufferedOutputManager
- output() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
- output() - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
- output(TupleTag<T>, WindowedValue<T>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.BufferedOutputManager
- output(TupleTag<T>, T) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
-
Adds the given element to the output PCollection with the given tag.
- output(TupleTag<T>, T, Instant, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
-
Adds the given element to the output PCollection with the given tag at the given timestamp in the given window.
- output(OutputT) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
-
Adds the given element to the main output PCollection.
- output(OutputT, Instant, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
-
Adds the given element to the main output PCollection at the given timestamp in the given window.
- output(T) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ParserCallback
-
Output the object.
- output(T) - Method in interface org.apache.beam.sdk.transforms.DoFn.OutputReceiver
- output(T, Instant) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ParserCallback
-
Output the object using the specified timestamp.
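
As a hedged sketch of the tagged-output methods listed above, a DoFn can route elements to a main and an additional output via TupleTags; the tag names and parsing logic here are illustrative, and `lines` is assumed to be an existing PCollection<String>:

import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollectionTuple;
import org.apache.beam.sdk.values.TupleTag;
import org.apache.beam.sdk.values.TupleTagList;

final TupleTag<Integer> parsed = new TupleTag<Integer>() {};
final TupleTag<String> failed = new TupleTag<String>() {};

PCollectionTuple results =
    lines.apply(
        ParDo.of(
                new DoFn<String, Integer>() {
                  @ProcessElement
                  public void processElement(@Element String line, MultiOutputReceiver out) {
                    try {
                      out.get(parsed).output(Integer.parseInt(line.trim()));
                    } catch (NumberFormatException e) {
                      out.get(failed).output(line); // route bad records to the tagged output
                    }
                  }
                })
            .withOutputTags(parsed, TupleTagList.of(failed)));
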
- OUTPUT - Enum constant in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Format
- OUTPUT - Enum constant in enum class org.apache.beam.sdk.io.cdap.PluginConstants.FormatProvider
- OUTPUT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- OUTPUT_DIR - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
-
MRJobConfig.MAPREDUCE_JOB_DIR.
- OUTPUT_FORMAT_CLASS_ATTR - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
-
MRJobConfig.OUTPUT_FORMAT_CLASS_ATTR.
- OUTPUT_INFO - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- OUTPUT_KEY_CLASS - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
-
MRJobConfig.OUTPUT_KEY_CLASS.
- OUTPUT_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- OUTPUT_ROW_SCHEMA - Static variable in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadSchemaTransformProvider
- OUTPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
- OUTPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
- OUTPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
- OUTPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
- OUTPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
- OUTPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
- OUTPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
- OUTPUT_SCHEMA - Static variable in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
- OUTPUT_SCHEMA - Static variable in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
- OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadSchemaTransformProvider
- OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
- OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
- OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider
- OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider
- OUTPUT_VALUE_CLASS - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
-
MRJobConfig.OUTPUT_VALUE_CLASS.
- outputCoder - Variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- Output Coders - Search tag in class org.apache.beam.sdk.transforms.ParDo
- Section
- outputCollectionNames() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
-
The expected PCollectionRowTuple output tags.
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
-
Implementation of the TypedSchemaTransformProvider outputCollectionNames method.
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCdcReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.iceberg.IcebergReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadProvider
-
Implementation of the TypedSchemaTransformProvider outputCollectionNames method.
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
-
Implementation of the TypedSchemaTransformProvider outputCollectionNames method.
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider
-
Implementation of the TypedSchemaTransformProvider outputCollectionNames method.
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider
-
Implementation of the TypedSchemaTransformProvider outputCollectionNames method.
- outputCollectionNames() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
- outputCollectionNames() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
Returns the output collection names of this transform.
- outputColumnMap - Variable in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
- outputFormatProvider - Variable in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
-
This should be set after the SubmitterLifecycle.prepareRun(Object) call, passing this context object as a parameter.
- outputManager - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- outputManagerFactory - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- outputOf(Contextful.Fn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
Like TypeDescriptors.outputOf(ProcessFunction) but for Contextful.Fn.
- outputOf(ProcessFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
Returns a type descriptor for the output of the given ProcessFunction, subject to Java type erasure: may contain unresolved type variables if the type was erased.
- outputOf(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
Binary compatibility adapter for TypeDescriptors.outputOf(ProcessFunction).
- OutputRangeTracker(OffsetRange) - Constructor for class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
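
A small sketch of TypeDescriptors.outputOf, which recovers the output type of a function so a coder can be inferred; the lengthFn lambda is illustrative:

import org.apache.beam.sdk.transforms.SerializableFunction;
import org.apache.beam.sdk.values.TypeDescriptor;
import org.apache.beam.sdk.values.TypeDescriptors;

SerializableFunction<String, Integer> lengthFn = s -> s.length();
TypeDescriptor<Integer> outputType = TypeDescriptors.outputOf(lengthFn);
// As the entries above note, Java type erasure applies: a lambda may yield a
// descriptor with unresolved type variables, so prefer a concrete class or an
// explicit descriptor such as TypeDescriptors.integers() when in doubt.
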
- OutputReceiverFactory - Interface in org.apache.beam.runners.fnexecution.control
-
A factory that can create output receivers during an executable stage.
- OutputReference - Class in org.apache.beam.runners.dataflow.util
-
A representation used by Steps to reference the output of other Steps.
- OutputReference(String, String) - Constructor for class org.apache.beam.runners.dataflow.util.OutputReference
- outputRuntimeOptions() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
Returns a map of properties which correspond to ValueProvider.RuntimeValueProvider, keyed by the property name.
- outputSchema() - Method in class org.apache.beam.sdk.schemas.transforms.Cast
- outputSchemaCoder - Variable in class org.apache.beam.sdk.schemas.utils.ConvertHelpers.ConvertedSchemaInformation
- Output Snapshots - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- OutputTagFilter<OutputT, InputT> - Class in org.apache.beam.runners.twister2.translators.functions
-
Output tag filter.
- OutputTagFilter() - Constructor for class org.apache.beam.runners.twister2.translators.functions.OutputTagFilter
- OutputTagFilter(int) - Constructor for class org.apache.beam.runners.twister2.translators.functions.OutputTagFilter
- outputWindowedValue(TupleTag<T>, T, Instant, Collection<? extends BoundedWindow>, PaneInfo) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
-
Adds the given element to the main output PCollection, with the given windowing metadata.
- outputWindowedValue(OutputT, Instant, Collection<? extends BoundedWindow>, PaneInfo) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
-
Adds the given element to the main output PCollection, with the given windowing metadata.
- outputWindowedValue(T, Instant, Collection<? extends BoundedWindow>, PaneInfo) - Method in interface org.apache.beam.sdk.transforms.DoFn.OutputReceiver
- outputWithTimestamp(TupleTag<T>, T, Instant) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
-
Adds the given element to the specified output PCollection, with the given timestamp.
- outputWithTimestamp(OutputT, Instant) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
-
Adds the given element to the main output PCollection, with the given timestamp.
- outputWithTimestamp(T, Instant) - Method in interface org.apache.beam.sdk.transforms.DoFn.OutputReceiver
- OVER_SPECIFIED - Enum constant in enum class org.apache.beam.sdk.coders.CannotProvideCoderException.ReasonCode
-
The reason a coder could not be provided is because the type variable T is over specified with multiple incompatible coders.
- overlaps(ByteKeyRange) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns true if the specified ByteKeyRange overlaps this range.
- overrideEncodingPositions(UUID, Map<String, Integer>) - Static method in class org.apache.beam.sdk.coders.RowCoder
-
Override encoding positions for the given schema.
- overrideEncodingPositions(UUID, Map<String, Integer>) - Static method in class org.apache.beam.sdk.coders.RowCoderGenerator
- overrideEncodingPositions(UUID, Map<String, Integer>) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
-
Override encoding positions for the given schema.
- ownerId - Variable in class org.apache.beam.runners.jet.processors.ParDoP.Supplier
- ownerId - Variable in class org.apache.beam.runners.jet.processors.StatefulParDoP.Supplier
P
- PackageUtil - Class in org.apache.beam.runners.dataflow.util
-
Helper routines for packages.
- PackageUtil.StagedFile - Class in org.apache.beam.runners.dataflow.util
- pairFunctionToPairFlatMapFunction(PairFunction<T, K, V>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
- pane() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
-
Returns information about the pane within this window into which the input element has been assigned.
- PaneInfo - Class in org.apache.beam.sdk.transforms.windowing
-
Provides information about the pane an element belongs to.
- PaneInfo.PaneInfoCoder - Class in org.apache.beam.sdk.transforms.windowing
-
A Coder for encoding PaneInfo instances.
- PaneInfo.Timing - Enum Class in org.apache.beam.sdk.transforms.windowing
-
Enumerates the possibilities for the timing of this pane firing related to the input and output watermarks for its computation.
- paneInfoFromBytes(byte[]) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
- paneInfoToBytes(PaneInfo) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
- PARALLEL_INPUT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- Parallel reading from a JDBC datasource - Search tag in class org.apache.beam.sdk.io.jdbc.JdbcIO
- Section
- Parallel reading from a SingleStoreDB datasource - Search tag in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
- Section
- ParameterListBuilder() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
- parameters - Variable in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase
-
Types of parameters for the function call.
- Parameters - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Parameters - Search tag in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
- Section
- Parameters - Search tag in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
- Section
- Params() - Constructor for class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
-
Construct a default Params object.
- ParamsCoder() - Constructor for class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
- ParDo - Class in org.apache.beam.sdk.transforms
-
ParDo is the core element-wise transform in Apache Beam, invoking a user-specified function on each of the elements of the input PCollection to produce zero or more output elements, all of which are collected into the output PCollection.
- ParDo() - Constructor for class org.apache.beam.sdk.transforms.ParDo
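
A minimal sketch of applying ParDo with an anonymous DoFn; the word-length logic is illustrative and `words` is assumed to be an existing PCollection<String>:

import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;

PCollection<Integer> lengths =
    words.apply(
        "ComputeLengths",
        ParDo.of(
            new DoFn<String, Integer>() {
              @ProcessElement
              public void processElement(@Element String word, OutputReceiver<Integer> out) {
                out.output(word.length()); // zero or more outputs may be emitted per element
              }
            }));
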
- ParDo.MultiOutput<InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
A PTransform that, when applied to a PCollection<InputT>, invokes a user-specified DoFn<InputT, OutputT> on all its elements, which can emit elements to any of the PTransform's output PCollections, which are bundled into a result PCollectionTuple.
- ParDo.SingleOutput<InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
A PTransform that, when applied to a PCollection<InputT>, invokes a user-specified DoFn<InputT, OutputT> on all its elements, with all its outputs collected into an output PCollection<OutputT>.
- ParDoMultiOutputTranslatorBatch<InputT, OutputT> - Class in org.apache.beam.runners.twister2.translators.batch
-
ParDo translator.
- ParDoMultiOutputTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.ParDoMultiOutputTranslatorBatch
- ParDoMultiOverrideFactory<InputT, OutputT> - Class in org.apache.beam.runners.direct
-
A PTransformOverrideFactory that provides overrides for applications of a ParDo in the direct runner.
- ParDoP<InputT,
OutputT> - Class in org.apache.beam.runners.jet.processors -
Jet
Processor
implementation for Beam's ParDo primitive (when no user-state is being used). - ParDoP.Supplier<InputT,
OutputT> - Class in org.apache.beam.runners.jet.processors -
Jet
Processor
supplier that will provide instances ofParDoP
. - ParDoStateUpdateFn<KeyT,
ValueT, - Class in org.apache.beam.runners.spark.translation.streamingInputT, OutputT> -
A function to handle stateful processing in Apache Beam's SparkRunner.
- ParDoStateUpdateFn(MetricsContainerStepMapAccumulator, String, DoFn<InputT, OutputT>, Coder<KeyT>, WindowedValues.FullWindowedValueCoder<ValueT>, SerializablePipelineOptions, TupleTag<?>, List<TupleTag<?>>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, Map<TupleTag<?>, KV<WindowingStrategy<?, ?>, SideInputBroadcast<?>>>, WindowingStrategy<?, ?>, DoFnSchemaInformation, Map<String, PCollectionView<?>>, Map<Integer, GlobalWatermarkHolder.SparkWatermarks>, List<Integer>, boolean) - Constructor for class org.apache.beam.runners.spark.translation.streaming.ParDoStateUpdateFn
- ParDoStateUpdateFn.SparkTimerInternalsIterator - Class in org.apache.beam.runners.spark.translation.streaming
-
An iterator implementation that processes timers from SparkTimerInternals.
- parent() - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
-
Type of parent node in a tree.
- ParquetConfiguration() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration
- parquetConfigurationBuilder() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- ParquetIO - Class in org.apache.beam.sdk.io.parquet
-
IO to read and write Parquet files.
- ParquetIO.Parse<T> - Class in org.apache.beam.sdk.io.parquet
-
Implementation of ParquetIO.parseGenericRecords(SerializableFunction).
- ParquetIO.ParseFiles<T> - Class in org.apache.beam.sdk.io.parquet
-
Implementation of ParquetIO.parseFilesGenericRecords(SerializableFunction).
- ParquetIO.Read - Class in org.apache.beam.sdk.io.parquet
-
Implementation of ParquetIO.read(Schema).
- ParquetIO.ReadFiles - Class in org.apache.beam.sdk.io.parquet
-
Implementation of ParquetIO.readFiles(Schema).
- ParquetIO.ReadFiles.BlockTracker - Class in org.apache.beam.sdk.io.parquet
- ParquetIO.Sink - Class in org.apache.beam.sdk.io.parquet
-
Implementation of ParquetIO.sink(org.apache.avro.Schema).
- ParquetReadSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
- ParquetReadSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.ParquetReadSchemaTransformFormatProvider
- ParquetTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.parquet
-
TableProvider for ParquetIO for consumption by Beam SQL.
- ParquetTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.parquet.ParquetTableProvider
- ParquetWriteSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
-
A FileWriteSchemaTransformFormatProvider for Parquet format.
- ParquetWriteSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.ParquetWriteSchemaTransformFormatProvider
- parse() - Static method in class org.apache.beam.sdk.io.tika.TikaIO
-
Parses files matching a given filepattern.
- parse(GridFSDBFile, MongoDbGridFSIO.ParserCallback<T>) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Parser
- parse(Class<T>, CSVFormat) - Static method in class org.apache.beam.sdk.io.csv.CsvIO
-
Instantiates a CsvIOParse for parsing CSV string records into custom Schema-mapped Class<T>es from the records' assumed CsvFormat.
- parse(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
-
Parse input SQL query, and return a SqlNode as grammar tree.
- parse(String) - Method in interface org.apache.beam.sdk.extensions.sql.impl.QueryPlanner
-
Parse input SQL query, and return a SqlNode as grammar tree.
- parse(String) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
- parse(String) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
Parse string with ClickHouse type to TableSchema.ColumnType.
- parse(String) - Static method in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.DefaultType
- parse(String) - Static method in class org.apache.beam.sdk.schemas.parser.FieldAccessDescriptorParser
- parse(T) - Method in class org.apache.beam.sdk.testing.JsonMatcher
- Parse() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
- Parse() - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
- Parse() - Constructor for class org.apache.beam.sdk.io.tika.TikaIO.Parse
- ParseAll() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
-
Deprecated.
- parseAllGenericRecords(SerializableFunction<GenericRecord, T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Deprecated. You can achieve the functionality of AvroIO.parseAllGenericRecords(SerializableFunction) using FileIO matching plus AvroIO.parseFilesGenericRecords(SerializableFunction); this is the preferred method to make composition explicit. AvroIO.ParseAll will not receive upgrades and will be removed in a future version of Beam.
- parseArgs(String[]) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
- parseDate(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- parseDateToValue(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- parseDefaultExpression(TableSchema.ColumnType, String) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
Get default value of a column based on expression.
- parseDicomWebpath(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser
- ParsedMetricName() - Constructor for class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.ParsedMetricName
- ParseException - Exception Class in org.apache.beam.sdk.extensions.sql.impl
-
Exception thrown when Beam SQL is unable to parse the statement.
- ParseException(String, Throwable) - Constructor for exception class org.apache.beam.sdk.extensions.sql.impl.ParseException
- ParseException(Throwable) - Constructor for exception class org.apache.beam.sdk.extensions.sql.impl.ParseException
- parseFiles() - Static method in class org.apache.beam.sdk.io.tika.TikaIO
-
Parses files in a PCollection of FileIO.ReadableFile.
- ParseFiles() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
- ParseFiles() - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.ParseFiles
- ParseFiles() - Constructor for class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
- parseFilesGenericRecords(SerializableFunction<GenericRecord, T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Like AvroIO.parseGenericRecords(SerializableFunction), but reads each FileIO.ReadableFile in the input PCollection.
- parseFilesGenericRecords(SerializableFunction<GenericRecord, T>) - Static method in class org.apache.beam.sdk.io.parquet.ParquetIO
-
Reads GenericRecord from Parquet files and converts to user defined type using provided parseFn.
- parseFn - Variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- parseGenericRecords(SerializableFunction<GenericRecord, T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Reads Avro file(s) containing records of an unspecified schema, converting each record to a custom type.
- parseGenericRecords(SerializableFunction<GenericRecord, T>) - Static method in class org.apache.beam.sdk.io.parquet.ParquetIO
-
Reads GenericRecord from a Parquet file (or multiple Parquet files matching the pattern) and converts to user defined type using provided parseFn.
- parseInitialContinuationTokens(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
-
Return a list of initial tokens from a row.
- ParseJsons<OutputT> - Class in org.apache.beam.sdk.extensions.jackson
-
PTransform for parsing JSON Strings.
- ParseJsons.ParseJsonsWithFailures<FailureT> - Class in org.apache.beam.sdk.extensions.jackson
-
A PTransform that adds exception handling to ParseJsons.
- parseLockUuid(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
-
Returns the uuid from a row.
- parseMetricName(String) - Static method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils
-
Parse a 'metric name' String that was created with 'MetricNameBuilder'.
- ParsePayloadAsPubsubMessageProto() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePayloadAsPubsubMessageProto
- parseProperties(String) - Static method in class org.apache.beam.sdk.extensions.sql.TableUtils
- ParsePubsubMessageProtoAsPayload() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePubsubMessageProtoAsPayload
- ParsePubsubMessageProtoAsPayloadFromWindowedValue() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.ParsePubsubMessageProtoAsPayloadFromWindowedValue
- parseQuery(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
- parseQuery(String, QueryPlanner.QueryParameters) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
- ParseResult - Class in org.apache.beam.sdk.io.tika
-
The result of parsing a single file with Tika: contains the file's location, metadata, extracted text, and optionally an error.
- ParseResult() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
- parseRows(Schema, CSVFormat) - Static method in class org.apache.beam.sdk.io.csv.CsvIO
-
Instantiates a CsvIOParse for parsing CSV string records into Rows from the records' assumed CsvFormat and expected Schema.
- parseTableSpec(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
-
Parse a table specification in the form "[project_id]:[dataset_id].[table_id]" or "[project_id].[dataset_id].[table_id]" or "[dataset_id].[table_id]".
- parseTableUrn(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
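
A minimal sketch of BigQueryHelpers.parseTableSpec using the spec forms listed above; the project, dataset, and table names are placeholders:

import com.google.api.services.bigquery.model.TableReference;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers;

TableReference ref = BigQueryHelpers.parseTableSpec("my-project:my_dataset.my_table");
// ref.getProjectId() -> "my-project", ref.getDatasetId() -> "my_dataset",
// ref.getTableId() -> "my_table"; the "[dataset_id].[table_id]" form leaves
// the project unset.
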
- parseTime(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- parseTimestampAsMsSinceEpoch(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return timestamp as ms-since-unix-epoch corresponding to timestamp.
- parseTimestampWithLocalTimeZone(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- parseTimestampWithoutTimeZone(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- parseTimestampWithTimeZone(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- parseTimestampWithTZToValue(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- parseTimestampWithUTCTimeZone(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- parseTimeToValue(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
- parseTokenFromRow(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
-
Read the continuation token cell of a row from ReadRows.
- parseWatermarkFromRow(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
-
Read the watermark cell of a row from ReadRows.
- parseWatermarkLastUpdatedFromRow(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
-
Return the timestamp (the time it was updated) of the watermark cell.
- ParseWithError() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError
- PartialFlinkCombiner(CombineFnBase.GlobalCombineFn<InputT, AccumT, ?>) - Constructor for class org.apache.beam.runners.flink.translation.functions.AbstractFlinkCombineRunner.PartialFlinkCombiner
- PartialReduceBundleOperator<K, InputT, OutputT, AccumT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
- PartialReduceBundleOperator(CombineFnBase.GlobalCombineFn<InputT, AccumT, OutputT>, String, Coder<WindowedValue<KV<K, InputT>>>, TupleTag<KV<K, AccumT>>, List<TupleTag<?>>, DoFnOperator.OutputManagerFactory<KV<K, AccumT>>, WindowingStrategy<?, ?>, Map<Integer, PCollectionView<?>>, Collection<PCollectionView<?>>, PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.PartialReduceBundleOperator
- Partition<T> - Class in org.apache.beam.sdk.transforms
-
Partition takes a PCollection<T> and a PartitionFn, uses the PartitionFn to split the elements of the input PCollection into N partitions, and returns a PCollectionList<T> that bundles N PCollection<T>s containing the split elements.
- PARTITION_CREATED_TO_SCHEDULED_MS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Time in milliseconds that a partition took to transition from PartitionMetadata.State.CREATED to PartitionMetadata.State.SCHEDULED.
- PARTITION_END_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of partition end records identified during the execution of the Connector.
- PARTITION_EVENT_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of partition event records identified during the execution of the Connector.
- PARTITION_MERGE_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Counter for the total number of partition merges identified during the execution of the Connector.
- PARTITION_RECONCILED_WITH_TOKEN_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Counter for the total number of partitions reconciled with continuation tokens.
- PARTITION_RECONCILED_WITHOUT_TOKEN_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Counter for the total number of partitions reconciled without continuation tokens.
- PARTITION_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of partitions identified during the execution of the Connector.
- PARTITION_RECORD_MERGE_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of partition merges identified during the execution of the Connector.
- PARTITION_RECORD_SPLIT_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of partition splits / moves identified during the execution of the Connector.
- PARTITION_SCHEDULED_TO_RUNNING_MS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Time in milliseconds that a partition took to transition from PartitionMetadata.State.SCHEDULED to PartitionMetadata.State.RUNNING.
- PARTITION_SPLIT_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Counter for the total number of partition splits / moves identified during the execution of the Connector.
- PARTITION_START_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of partition start records identified during the execution of the Connector.
- PARTITION_STREAM_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Counter for the total number of active partitions being streamed.
- PARTITION_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
-
The token of the initial partition.
- Partition.PartitionFn<T> - Interface in org.apache.beam.sdk.transforms
-
A function object that chooses an output partition for an element.
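
A short sketch of the Partition transform and PartitionFn entries above, splitting scores in the range [0, 100) into ten buckets; the bucketing logic is illustrative and `scores` is assumed to be an existing PCollection<Double>:

import org.apache.beam.sdk.transforms.Partition;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PCollectionList;

PCollectionList<Double> deciles =
    scores.apply(
        Partition.of(
            10,
            (Partition.PartitionFn<Double>)
                (score, numPartitions) ->
                    (int) Math.min(numPartitions - 1, score * numPartitions / 100.0)));
PCollection<Double> topDecile = deciles.get(9); // partitions are zero-indexed
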
- Partition.PartitionWithSideInputsFn<T> - Interface in org.apache.beam.sdk.transforms
-
A function object that chooses an output partition for an element.
- Partition Assignment and Checkpointing - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- PartitionContext() - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
- PartitionEndRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
A partition end record serves as a notification that the client should stop reading the partition.
- PartitionEndRecord(Timestamp, String, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEndRecord
-
Constructs the partition end record with the given timestamp, record sequence and metadata.
- partitionEndRecordAction(PartitionMetadataDao, ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class capable of processing PartitionEndRecords.
- PartitionEndRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is part of the process for ReadChangeStreamPartitionDoFn SDF.
- partitioner() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
- PARTITIONER_CLASS_ATTR - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
-
MRJobConfig.PARTITIONER_CLASS_ATTR.
- PartitionEventRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
A partition event record describes key range changes for a change stream partition.
- PartitionEventRecord(Timestamp, String, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEventRecord
-
Constructs the partition event record with the given partitions.
- partitionEventRecordAction(PartitionMetadataDao, ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class capable of processing PartitionEventRecords.
- PartitionEventRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is part of the process for ReadChangeStreamPartitionDoFn SDF.
- partitionFields(List<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
- partitionFor(T, int) - Method in interface org.apache.beam.sdk.transforms.Partition.PartitionFn
-
Chooses the partition into which to put the given element.
- partitionFor(T, int, Contextful.Fn.Context) - Method in interface org.apache.beam.sdk.transforms.Partition.PartitionWithSideInputsFn
-
Chooses the partition into which to put the given element.
- Partitioning of writes - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- PartitioningWindowFn<T, W> - Class in org.apache.beam.sdk.transforms.windowing
-
A WindowFn that places each value into exactly one window based on its timestamp and never merges windows.
- PartitionMark(String, int, long, long) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- PartitionMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Model for the partition metadata database table used in the Connector.
- PartitionMetadata(String, HashSet<String>, Timestamp, Timestamp, long, PartitionMetadata.State, Timestamp, Timestamp, Timestamp, Timestamp, Timestamp) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
- PartitionMetadata.Builder - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Partition metadata builder for better user experience.
- PartitionMetadata.State - Enum Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
The state at which a partition can be in the system: CREATED: the partition has been created, but no query has been done against it yet.
- PartitionMetadataAdminDao - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Data access object for creating and dropping the partition metadata table.
- PartitionMetadataDao - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Data access object for the Connector metadata tables.
- PartitionMetadataDao.InTransactionContext - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Represents the execution of a read / write transaction in Cloud Spanner.
- PartitionMetadataDao.TransactionResult<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Represents a result from executing a Cloud Spanner read / write transaction.
- partitionMetadataMapper() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.MapperFactory
-
Creates and returns a single instance of a mapper class capable of transforming a Struct into a PartitionMetadata class.
- PartitionMetadataMapper - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
-
This class is responsible for transforming a Struct to a PartitionMetadata.
- PartitionMetadataTableNames - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Configuration for a partition metadata table.
- PartitionMetadataTableNames(String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
- partitionQuery() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
-
Factory method to create a new type safe builder for PartitionQueryRequest operations.
- PartitionReconciler - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler
-
There can be a race when many splits and merges happen to a single partition in quick succession.
- PartitionReconciler(MetadataTableDao, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
- PartitionRecord - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
-
Output result of DetectNewPartitionsDoFn containing information required to stream a partition.
- PartitionRecord(Range.ByteStringRange, List<ChangeStreamContinuationToken>, String, Instant, List<NewPartition>, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- PartitionRecord(Range.ByteStringRange, List<ChangeStreamContinuationToken>, Instant, List<NewPartition>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- PartitionRecord(Range.ByteStringRange, Instant, String, Instant, List<NewPartition>, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- PartitionRecord(Range.ByteStringRange, Instant, Instant, List<NewPartition>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- PartitionStartRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
A partition start record serves as a notification that the client should schedule the partitions to be queried.
- PartitionStartRecord(Timestamp, String, List<String>, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionStartRecord
-
Constructs the partition start record with the given partitions.
- partitionStartRecordAction(PartitionMetadataDao, ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class capable of processing PartitionStartRecords.
- PartitionStartRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is part of the process for the ReadChangeStreamPartitionDoFn SDF.
- partitionsToString(List<Range.ByteStringRange>) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
-
Convert partitions to a string for debugging.
- PAssert - Class in org.apache.beam.sdk.testing
-
An assertion on the contents of a PCollection incorporated into the pipeline.
- PAssert.DefaultConcludeTransform - Class in org.apache.beam.sdk.testing
-
Default transform to check that a PAssert was successful.
- PAssert.GroupThenAssert<T> - Class in org.apache.beam.sdk.testing
-
A transform that applies an assertion-checking function over iterables of ActualT to the entirety of the contents of its input.
- PAssert.GroupThenAssertForSingleton<T> - Class in org.apache.beam.sdk.testing
-
A transform that applies an assertion-checking function to the sole element of a PCollection.
- PAssert.IterableAssert<T> - Interface in org.apache.beam.sdk.testing
-
Builder interface for assertions applicable to iterables and PCollection contents.
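
A minimal sketch of PAssert in a unit test, assuming TestPipeline and a runner (e.g. the direct runner) are on the test classpath:

import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;

TestPipeline p = TestPipeline.create();
PCollection<Integer> nums = p.apply(Create.of(1, 2, 3));
PAssert.that(nums).containsInAnyOrder(3, 1, 2); // checked when the pipeline runs
p.run().waitUntilFinish();
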
- PAssert.MatcherCheckerFn<T> - Class in org.apache.beam.sdk.testing
-
Check that the passed-in matchers match the existing data.
- PAssert.OneSideInputAssert<ActualT> - Class in org.apache.beam.sdk.testing
-
An assertion checker that takes a single PCollectionView<ActualT> and an assertion over ActualT, and checks it within a Beam pipeline.
- PAssert.PAssertionSite - Class in org.apache.beam.sdk.testing
-
Track the place where an assertion is defined.
- PAssert.PCollectionContentsAssert<T> - Class in org.apache.beam.sdk.testing
-
A PAssert.IterableAssert about the contents of a PCollection.
- PAssert.PCollectionListContentsAssert<T> - Class in org.apache.beam.sdk.testing
-
An assert about the contents of each PCollection in the given PCollectionList.
- PAssert.SingletonAssert<T> - Interface in org.apache.beam.sdk.testing
-
Builder interface for assertions applicable to a single value.
- PassThroughLogicalType<T> - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A base class for LogicalTypes that use the same Java type as the underlying base type.
- PassThroughLogicalType(String, Schema.FieldType, Object, Schema.FieldType) - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
- password() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
-
The password to use for authentication.
- password(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory.Builder
-
Set Solace password.
- password(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory.Builder
-
Set Solace password.
- passwordSecretName() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
- passwordSecretName(String) - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
-
The Secret Manager secret name where the password is stored.
- passwordSecretVersion() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
- passwordSecretVersion(String) - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
-
Optional.
- pastEndOfWindow() - Static method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark
-
Creates a trigger that fires when the watermark passes the end of the window.
- pastFirstElementInPane() - Static method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
Creates a trigger that fires when the current processing time passes the processing time at which this trigger saw the first element in a pane.
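
A hedged sketch combining the two triggers above: fire when the watermark passes the end of the window, with early firings 30 seconds of processing time after the first element in a pane; the window size, delay, and `events` PCollection are illustrative:

import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.joda.time.Duration;

events.apply(
    Window.<String>into(FixedWindows.of(Duration.standardMinutes(5)))
        .triggering(
            AfterWatermark.pastEndOfWindow()
                .withEarlyFirings(
                    AfterProcessingTime.pastFirstElementInPane()
                        .plusDelayOf(Duration.standardSeconds(30))))
        .withAllowedLateness(Duration.ZERO)
        .accumulatingFiredPanes());
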
- patchTableDescription(TableReference, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Patch BigQuery Table description.
- patchTableDescription(TableReference, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
- patchTableDescription(TableReference, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- path - Variable in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
- path() - Method in class org.apache.beam.sdk.schemas.transforms.Cast.CompatibilityError
- path() - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
-
Field path from a root of a schema.
- pathString - Variable in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
- PathValidator - Interface in org.apache.beam.sdk.extensions.gcp.storage
-
For internal use only; no backwards compatibility guarantees.
- PathValidatorFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.PathValidatorFactory
- PATIENT_EVERYTHING - Enum constant in enum class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod
- PatientEverythingParameter() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.PatientEverythingParameter
- PatternCondition - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
PatternCondition stores the function to decide whether a row is a match of a single pattern.
- payload() - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError
- PAYLOAD_TOO_LARGE - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
- PayloadSerializer - Interface in org.apache.beam.sdk.schemas.io.payloads
- PayloadSerializerKafkaTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.kafka
- PayloadSerializerProvider - Interface in org.apache.beam.sdk.schemas.io.payloads
- PayloadSerializers - Class in org.apache.beam.sdk.schemas.io.payloads
- payloadToConfig(ExternalTransforms.ExternalConfigurationPayload, Class<ConfigT>) - Static method in class org.apache.beam.sdk.expansion.service.ExpansionService
-
Attempt to create an instance of ExpansionService from an ExternalTransforms.ExternalConfigurationPayload.
- PBegin - Class in org.apache.beam.sdk.values
- PBegin(Pipeline) - Constructor for class org.apache.beam.sdk.values.PBegin
- PCollection<T> - Class in org.apache.beam.sdk.values
-
A PCollection<T> is an immutable collection of values of type T.
- PCOLLECTION_NAME - Static variable in class org.apache.beam.sdk.extensions.sql.SqlTransform
- PCollection.IsBounded - Enum Class in org.apache.beam.sdk.values
-
The enumeration of cases for whether a PCollection is bounded.
- PCollectionContentsAssert(PCollection<T>, PAssert.AssertionWindows, SimpleFunction<Iterable<ValueInSingleWindow<T>>, Iterable<T>>, PAssert.PAssertionSite) - Constructor for class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
- PCollectionContentsAssert(PCollection<T>, PAssert.PAssertionSite) - Constructor for class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
- PCollectionList<T> - Class in org.apache.beam.sdk.values
-
A PCollectionList<T> is an immutable list of homogeneously typed PCollection<T>s.
- PCollectionListContentsAssert(PCollectionList<T>) - Constructor for class org.apache.beam.sdk.testing.PAssert.PCollectionListContentsAssert
- PCollectionRowTuple - Class in org.apache.beam.sdk.values
-
A PCollectionRowTuple is an immutable tuple of PCollections, "keyed" by a string tag.
- pCollections() - Static method in class org.apache.beam.sdk.transforms.Flatten
-
Returns a PTransform that flattens a PCollectionList into a PCollection containing all the elements of all the PCollections in its input.
- PCollectionTuple - Class in org.apache.beam.sdk.values
-
A PCollectionTuple is an immutable tuple of heterogeneously-typed PCollections, "keyed" by TupleTags.
- PCollectionView<T> - Interface in org.apache.beam.sdk.values
-
A PCollectionView<T> is an immutable view of a PCollection as a value of type T that can be accessed as a side input to a ParDo transform.
- PCollectionViews - Class in org.apache.beam.sdk.values
-
For internal use only; no backwards compatibility guarantees.
- PCollectionViews() - Constructor for class org.apache.beam.sdk.values.PCollectionViews
- PCollectionViews.HasDefaultValue<T> - Interface in org.apache.beam.sdk.values
- PCollectionViews.InMemoryListFromMultimapViewFn<T> - Class in org.apache.beam.sdk.values
-
Implementation which is able to adapt a multimap materialization to an in-memory
List<T>
. - PCollectionViews.InMemoryListViewFn<T> - Class in org.apache.beam.sdk.values
-
Implementation which is able to adapt an iterable materialization to an in-memory
List<T>
. - PCollectionViews.InMemoryMapFromVoidKeyViewFn<K,
V> - Class in org.apache.beam.sdk.values -
Implementation which is able to adapt a multimap materialization to an in-memory
Map<K, V>
. - PCollectionViews.InMemoryMapViewFn<K,
V> - Class in org.apache.beam.sdk.values -
Implementation which is able to adapt an iterable materialization to an in-memory
Map<K, V>
. - PCollectionViews.InMemoryMultimapFromVoidKeyViewFn<K,
V> - Class in org.apache.beam.sdk.values -
Implementation which is able to adapt a multimap materialization to an in-memory
Map<K, Iterable<V>>
. - PCollectionViews.InMemoryMultimapViewFn<K,
V> - Class in org.apache.beam.sdk.values -
Implementation which is able to adapt an iterable materialization to an in-memory
Map<K, Iterable<V>>
. - PCollectionViews.IsSingletonView<T> - Interface in org.apache.beam.sdk.values
- PCollectionViews.IterableBackedListViewFn<T> - Class in org.apache.beam.sdk.values
-
Implementation which is able to adapt an iterable materialization to a
List<T>
. - PCollectionViews.IterableViewFn<T> - Class in org.apache.beam.sdk.values
-
Deprecated.
- PCollectionViews.IterableViewFn2<T> - Class in org.apache.beam.sdk.values
-
Implementation which is able to adapt an iterable materialization to an
Iterable<T>
. - PCollectionViews.ListViewFn<T> - Class in org.apache.beam.sdk.values
-
Deprecated.
- PCollectionViews.ListViewFn2<T> - Class in org.apache.beam.sdk.values
-
Implementation which is able to adapt a multimap materialization to a
List<T>
. - PCollectionViews.MapViewFn<K,
V> - Class in org.apache.beam.sdk.values -
Deprecated.
- PCollectionViews.MapViewFn2<K,
V> - Class in org.apache.beam.sdk.values -
Implementation which is able to adapt a multimap materialization to a
Map<K, V>
. - PCollectionViews.MultimapViewFn<K,
V> - Class in org.apache.beam.sdk.values -
Deprecated.
- PCollectionViews.MultimapViewFn2<K,
V> - Class in org.apache.beam.sdk.values -
Implementation which is able to adapt a multimap materialization to a
Map<K, Iterable<V>>
. - PCollectionViews.SimplePCollectionView<ElemT,
PrimitiveViewT, - Class in org.apache.beam.sdk.valuesViewT, W> -
A class for
PCollectionView
implementations, with additional type parameters that are not visible at pipeline assembly time when the view is used as a side input. - PCollectionViews.SingletonViewFn<T> - Class in org.apache.beam.sdk.values
-
Deprecated.
- PCollectionViews.SingletonViewFn2<T> - Class in org.apache.beam.sdk.values
-
Implementation which is able to adapt an iterable materialization to a
T
. - PCollectionViews.TypeDescriptorSupplier<T> - Interface in org.apache.beam.sdk.values
- PCollectionViews.ValueOrMetadata<T,
MetaT> - Class in org.apache.beam.sdk.values -
Stores values or metadata about values.
- PCollectionViews.ValueOrMetadataCoder<T,
MetaT> - Class in org.apache.beam.sdk.values -
A coder for
PCollectionViews.ValueOrMetadata
. - PCollectionViewTranslatorBatch<ElemT,
ViewT> - Class in org.apache.beam.runners.twister2.translators.batch -
PCollectionView translator.
- PCollectionViewTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.PCollectionViewTranslatorBatch
- PDone - Class in org.apache.beam.sdk.values
- peekOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated.Use
TestPipeline
with theDirectRunner
. - peekOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated.Use
TestPipeline
with theDirectRunner
. - peekOutputElementsInWindow(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated.Use
TestPipeline
with theDirectRunner
. - peekOutputElementsInWindow(TupleTag<OutputT>, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated.Use
TestPipeline
with theDirectRunner
. - peekOutputElementsWithTimestamp() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated.Use
TestPipeline
with theDirectRunner
. - PENDING_BYTES_METRIC_NAME - Static variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
- perElement() - Static method in class org.apache.beam.sdk.transforms.Count
-
Returns a
PTransform
that counts the number of occurrences of each element in its inputPCollection
. - PeriodicImpulse - Class in org.apache.beam.sdk.transforms
-
A
PTransform
which produces a sequence of elements at fixed runtime intervals. - PeriodicSequence - Class in org.apache.beam.sdk.transforms
-
A
PTransform
which generates a sequence of timestamped elements at given runtime intervals. - PeriodicSequence.OutputRangeTracker - Class in org.apache.beam.sdk.transforms
- PeriodicSequence.SequenceDefinition - Class in org.apache.beam.sdk.transforms
- PeriodicStatusPageDirectoryFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.PeriodicStatusPageDirectoryFactory
- perKey() - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
-
Like
ApproximateDistinct.globally()
but per key, i.e. computes the approximate number of distinct values per key in aPCollection<KV<K, V>>
and returnsPCollection<KV<K, Long>>
. - perKey() - Static method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
-
Like
SketchFrequencies.globally()
but per key, i.e. a Count-min sketch per key inPCollection<KV<K, V>>
and returns aPCollection<KV<K, CountMinSketch>>
. - perKey() - Static method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
-
Like
TDigestQuantiles.globally()
, but builds a digest for each key in the stream. - perKey() - Static method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
- perKey() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Extract
-
Returns a
PTransform
that takes an inputPCollection<KV<K, byte[]>>
of (key, HLL++ sketch) pairs and returns aPCollection<KV<K, Long>>
of (key, estimated count of distinct elements extracted from each sketch). - perKey() - Method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init.Builder
-
Returns a
Combine.PerKey
PTransform
that takes an inputPCollection<KV<K, InputT>>
and returns aPCollection<KV<K, byte[]>>
which consists of the per-key HLL++ sketch computed from the values matching each key in the inputPCollection
. - perKey() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.MergePartial
-
Returns a
Combine.PerKey
PTransform
that takes an inputPCollection<KV<K, byte[]>>
of (key, HLL++ sketch) pairs and returns aPCollection<KV<K, byte[]>>
of (key, new sketch merged from the input sketches under the key). - perKey() - Static method in class org.apache.beam.sdk.transforms.Count
-
Returns a
PTransform
that counts the number of elements associated with each key of its inputPCollection
. - perKey() - Static method in class org.apache.beam.sdk.transforms.Latest
-
Returns a
PTransform
that takes as input aPCollection<KV<K, V>>
and returns aPCollection<KV<K, V>>
whose contents is the latest element per-key according to its event time. - perKey() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a
PTransform
that takes an inputPCollection<KV<K, T>>
and returns aPCollection<KV<K, T>>
that contains an output element mapping each distinct key in the inputPCollection
to the maximum according to the natural ordering ofT
of the values associated with that key in the inputPCollection
. - perKey() - Static method in class org.apache.beam.sdk.transforms.Mean
-
Returns a
PTransform
that takes an inputPCollection<KV<K, N>>
and returns aPCollection<KV<K, Double>>
that contains an output element mapping each distinct key in the inputPCollection
to the mean of the values associated with that key in the inputPCollection
. - perKey() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a
PTransform
that takes an inputPCollection<KV<K, T>>
and returns aPCollection<KV<K, T>>
that contains an output element mapping each distinct key in the inputPCollection
to the minimum according to the natural ordering ofT
of the values associated with that key in the inputPCollection
. - perKey(double) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
-
Deprecated.Like
ApproximateUnique.perKey(int)
, but specifies the desired maximum estimation error instead of the sample size. - perKey(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
-
Like
ApproximateQuantiles.perKey(int, Comparator)
, but sorts values using their natural ordering. - perKey(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
-
Deprecated.Returns a
PTransform
that takes aPCollection<KV<K, V>>
and returns aPCollection<KV<K, Long>>
that contains an output element mapping each distinct key in the inputPCollection
to an estimate of the number of distinct values associated with that key in the inputPCollection
. - perKey(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
-
Returns a
PTransform
that takes aPCollection<KV<K, V>>
and returns aPCollection<KV<K, List<V>>>
that contains an output element mapping each distinct key in the inputPCollection
to aList
of the approximateN
-tiles of the values associated with that key in the inputPCollection
. - perKey(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a
PTransform
that takes an inputPCollection<KV<K, V>>
and returns aPCollection<KV<K, List<V>>>
that contains an output element mapping each distinct key in the inputPCollection
to the largestcount
values associated with that key in the inputPCollection<KV<K, V>>
, in decreasing order, sorted using the givenComparator<V>
. - perKey(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a
PTransform
that takes an inputPCollection<KV<K, T>>
and returns aPCollection<KV<K, T>>
that contains one output element per key mapping each to the maximum of the values associated with that key in the inputPCollection
. - perKey(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a
PTransform
that takes an inputPCollection<KV<K, T>>
and returns aPCollection<KV<K, T>>
that contains one output element per key mapping each to the minimum of the values associated with that key in the inputPCollection
. - perKey(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.PerKey
PTransform
that first groups its inputPCollection
ofKV
s by keys and windows, then invokes the given function on each of the values lists to produce a combined value, and then returns aPCollection
ofKV
s mapping each distinct key to its combined value for each window. - perKey(SerializableBiFunction<V, V, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.PerKey
PTransform
that first groups its inputPCollection
ofKV
s by keys and windows, then invokes the given function on each of the values lists to produce a combined value, and then returns aPCollection
ofKV
s mapping each distinct key to its combined value for each window. - perKey(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a
Combine.PerKey
PTransform
that first groups its inputPCollection
ofKV
s by keys and windows, then invokes the given function on each of the values lists to produce a combined value, and then returns aPCollection
ofKV
s mapping each distinct key to its combined value for each window. - PerKey() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
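A minimal sketch of the Combine.perKey(SerializableBiFunction) overload above, assuming a hypothetical input collection of per-key scores:

    PCollection<KV<String, Integer>> scores = ...;  // hypothetical input
    PCollection<KV<String, Integer>> totals =
        scores.apply(Combine.perKey((Integer a, Integer b) -> a + b));  // sum values per key and window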
- PerKey(double) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.PerKey
-
Deprecated.
- PerKey(int) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.PerKey
-
Deprecated.
- PerKeyDigest() - Constructor for class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.PerKeyDigest
- PerKeyDistinct() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.PerKeyDistinct
- PerKeySketch() - Constructor for class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.PerKeySketch
- Permissions - Search tag in class org.apache.beam.runners.dataflow.DataflowRunner
- Section
- Permissions - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroSource
- Section
- Permissions - Search tag in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
- Section
- Permissions - Search tag in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
- Section
- Permissions - Search tag in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
- Section
- Permissions - Search tag in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1
- Section
- Permissions - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
- Section
- PFADD - Enum constant in enum class org.apache.beam.sdk.io.redis.RedisIO.Write.Method
-
Use the PFADD command.
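A sketch of selecting this method on a write (endpoint and input collection are hypothetical):

    pcoll.apply(RedisIO.write()
        .withEndpoint("localhost", 6379)
        .withMethod(RedisIO.Write.Method.PFADD));  // add elements to a HyperLogLog structure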
- pin() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
-
Pin this object.
- PInput - Interface in org.apache.beam.sdk.values
-
The interface for things that might be input to a
PTransform
. - pipeline() - Method in class org.apache.beam.runners.jobsubmission.JobPreparation
- Pipeline - Class in org.apache.beam.sdk
-
A
Pipeline
manages a directed acyclic graph ofPTransforms
, and thePCollections
that thePTransforms
consume and produce. - Pipeline(PipelineOptions) - Constructor for class org.apache.beam.sdk.Pipeline
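A minimal end-to-end sketch of assembling and running a Pipeline (the transforms chosen here are illustrative, not prescriptive):

    PipelineOptions options = PipelineOptionsFactory.create();
    Pipeline p = Pipeline.create(options);
    p.apply(Create.of("a", "b", "c"))                      // root PTransform producing a PCollection
     .apply(MapElements.into(TypeDescriptors.strings())
         .via((String s) -> s.toUpperCase()));             // downstream PTransform
    p.run().waitUntilFinish();                             // execute the DAG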
- PIPELINE_PROTO_CODER_ID - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- Pipeline.PipelineExecutionException - Exception Class in org.apache.beam.sdk
- Pipeline.PipelineVisitor - Interface in org.apache.beam.sdk
-
For internal use only; no backwards-compatibility guarantees.
- Pipeline.PipelineVisitor.CompositeBehavior - Enum Class in org.apache.beam.sdk
-
Control enum for indicating whether a traversal should process the contents of a composite transform.
- Pipeline.PipelineVisitor.Defaults - Class in org.apache.beam.sdk
-
Default no-op
Pipeline.PipelineVisitor
that enters all composite transforms. - PIPELINED - Static variable in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- PipelineExecutionException(Throwable) - Constructor for exception class org.apache.beam.sdk.Pipeline.PipelineExecutionException
-
Wraps
cause
into aPipeline.PipelineExecutionException
. - PipelineMessageReceiver - Interface in org.apache.beam.runners.local
-
Handles failures in the form of exceptions.
- pipelineOptions - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- pipelineOptions() - Method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
- PipelineOptions - Interface in org.apache.beam.sdk.options
-
PipelineOptions are used to configure Pipelines.
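For example, a custom options interface (the option name and default below are hypothetical) is declared and then materialized via PipelineOptionsFactory:

    public interface MyOptions extends PipelineOptions {
      @Description("Path to the input file")
      @Default.String("/tmp/input.txt")   // hypothetical default
      String getInputFile();
      void setInputFile(String value);
    }

    MyOptions opts =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);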
- PipelineOptions.AtomicLongFactory - Class in org.apache.beam.sdk.options
-
DefaultValueFactory
which supplies an ID that is guaranteed to be unique within the given process. - PipelineOptions.CheckEnabled - Enum Class in org.apache.beam.sdk.options
-
Enumeration of the possible states for a given check.
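For instance, the stable-unique-names check on PipelineOptions accepts one of these states; a minimal sketch:

    PipelineOptions options = PipelineOptionsFactory.create();
    options.setStableUniqueNames(PipelineOptions.CheckEnabled.ERROR);  // fail on non-unique transform names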
- PipelineOptions.DirectRunner - Class in org.apache.beam.sdk.options
-
A
DefaultValueFactory
that obtains the class of theDirectRunner
if it exists on the classpath, and throws an exception otherwise. - PipelineOptions.JobNameFactory - Class in org.apache.beam.sdk.options
-
Returns a normalized job name constructed from
ApplicationNameOptions.getAppName()
, the local system user name (if available), the current time, and a random integer. - PipelineOptions.UserAgentFactory - Class in org.apache.beam.sdk.options
-
Returns a user agent string constructed from
ReleaseInfo.getName()
andReleaseInfo.getVersion()
, in the format[name]/[version]
. - PipelineOptionsFactory - Class in org.apache.beam.sdk.options
-
Constructs a
PipelineOptions
or any derived interface that is composable to any other derived interface ofPipelineOptions
via thePipelineOptions.as(java.lang.Class<T>)
method. - PipelineOptionsFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptionsFactory
- PipelineOptionsFactory.Builder - Class in org.apache.beam.sdk.options
-
A fluent
PipelineOptions
builder. - PipelineOptionsRegistrar - Interface in org.apache.beam.sdk.options
-
PipelineOptions
creators have the ability to automatically have theirPipelineOptions
registered with this SDK by creating aServiceLoader
entry and a concrete implementation of this interface. - PipelineOptionsValidator - Class in org.apache.beam.sdk.options
-
Validates that the
PipelineOptions
conforms to all theValidation
criteria. - PipelineOptionsValidator() - Constructor for class org.apache.beam.sdk.options.PipelineOptionsValidator
- PipelineResult - Interface in org.apache.beam.sdk
-
Result of
Pipeline.run()
. - PipelineResult.State - Enum Class in org.apache.beam.sdk
-
Possible job states, for both completed and ongoing jobs.
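A short sketch of inspecting the terminal state of a run (the error handling shown is illustrative):

    PipelineResult result = p.run();
    PipelineResult.State state = result.waitUntilFinish();  // blocks until a terminal state
    if (state != PipelineResult.State.DONE) {
      throw new RuntimeException("Pipeline terminated in state " + state);
    }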
- PipelineRunner<ResultT> - Class in org.apache.beam.sdk
-
A
PipelineRunner
runs aPipeline
. - PipelineRunner() - Constructor for class org.apache.beam.sdk.PipelineRunner
- PipelineTranslator - Class in org.apache.beam.runners.spark.structuredstreaming.translation
-
The pipeline translator translates a Beam
Pipeline
into a Spark correspondence, that can then be evaluated. - PipelineTranslator() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
- PipelineTranslator.TranslationState - Interface in org.apache.beam.runners.spark.structuredstreaming.translation
-
Shared, mutable state used during the translation of a pipeline and discarded afterwards.
- PipelineTranslator.UnresolvedTranslation<InT,
T> - Interface in org.apache.beam.runners.spark.structuredstreaming.translation -
Unresolved translation, allowing optimization of the generated Spark DAG.
- PipelineTranslatorBatch - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch
-
PipelineTranslator
for executing aPipeline
in Spark in batch mode. - PipelineTranslatorBatch() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.batch.PipelineTranslatorBatch
- PipelineTranslatorUtils - Class in org.apache.beam.runners.fnexecution.translation
-
Utilities for pipeline translation.
- placementId() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- PlainAccumulatorState() - Constructor for class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark.PlainAccumulatorState
- plainRead(KafkaIOUtilsBenchmark.PlainAccumulatorState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- plainReadWhileWriting(KafkaIOUtilsBenchmark.PlainAccumulatorState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- plainWrite(KafkaIOUtilsBenchmark.PlainAccumulatorState, KafkaIOUtilsBenchmark.ProducerState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- plainWriteWhileReading(KafkaIOUtilsBenchmark.PlainAccumulatorState, KafkaIOUtilsBenchmark.ProducerState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- Plugin<K,
V> - Class in org.apache.beam.sdk.io.cdap -
Class wrapper for a CDAP plugin.
- Plugin() - Constructor for class org.apache.beam.sdk.io.cdap.Plugin
- Plugin.Builder<K,
V> - Class in org.apache.beam.sdk.io.cdap -
Builder class for a
Plugin
. - pluginConfig - Variable in class org.apache.beam.sdk.io.cdap.Plugin
- PluginConfigInstantiationUtils - Class in org.apache.beam.sdk.io.cdap
-
Class for getting any filled
PluginConfig
configuration object. - PluginConfigInstantiationUtils() - Constructor for class org.apache.beam.sdk.io.cdap.PluginConfigInstantiationUtils
- PluginConstants - Class in org.apache.beam.sdk.io.cdap
-
Class for CDAP plugin constants.
- PluginConstants() - Constructor for class org.apache.beam.sdk.io.cdap.PluginConstants
- PluginConstants.Format - Enum Class in org.apache.beam.sdk.io.cdap
-
Format types.
- PluginConstants.FormatProvider - Enum Class in org.apache.beam.sdk.io.cdap
-
Format provider types.
- PluginConstants.Hadoop - Enum Class in org.apache.beam.sdk.io.cdap
-
Hadoop types.
- PluginConstants.PluginType - Enum Class in org.apache.beam.sdk.io.cdap
-
Plugin types.
- plus(NodeStats) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
- plus(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- PLUS - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- PLUS - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
- PLUS_RELUCTANT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
- plusDelayOf(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
Adds some delay to the original target time.
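A sketch of plusDelayOf(Duration) inside a windowing configuration (the window size, delay, and input collection are arbitrary assumptions):

    input.apply(
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(10)))
            .triggering(AfterProcessingTime.pastFirstElementInPane()
                .plusDelayOf(Duration.standardSeconds(30)))  // fire 30s after the first element arrives
            .withAllowedLateness(Duration.ZERO)
            .discardingFiredPanes());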
- poisonInstructionId(String) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
-
Poisons an instruction id.
- POJOUtils - Class in org.apache.beam.sdk.schemas.utils
-
A set of utilities to generate getter and setter classes for POJOs.
- POJOUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.POJOUtils
- PollFn() - Constructor for class org.apache.beam.sdk.transforms.Watch.Growth.PollFn
- pollFor(Duration) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery.PollingAssertion
- pollJob(JobReference, int) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Waits until the job is Done, and returns the job.
- pollJob(JobReference, int) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- pollNext(ReaderOutput<WindowedValue<ValueWithRecordId<T>>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
- pollNext(ReaderOutput<WindowedValue<T>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSourceReader
- pollOperation(Operation, Long) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Poll operation.
- pollOperation(Operation, Long) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- pooledClientFactory(BuilderT) - Static method in class org.apache.beam.sdk.io.aws2.common.ObjectPool
- popDataset(String) - Method in class org.apache.beam.runners.spark.translation.SparkTranslationContext
-
Retrieve the dataset for the pCollection id and remove it from the DAG's leaves.
- populateDisplayData(SingleStoreIO.DataSourceConfiguration, DisplayData.Builder) - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
-
Deprecated.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
-
Deprecated.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.CompressedSource
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.ReadFiles
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Populates the display data.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
-
Populates the display data.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSink
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSource
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileIO.Match
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileIO.ReadMatches
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.GenerateSequence
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceProviderFromDataSourceConfiguration
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.PoolableDataSourceProvider
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.TypedWithoutMetadata
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverProviderFromDriverConfiguration
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ParseFiles
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.Read.Bounded
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.Read.Unbounded
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
Populate the display data with connectionConfiguration details.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceProviderFromDataSourceConfiguration
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.Source
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
-
Deprecated.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.ReadFiles
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.ReadFiles
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.WriteFiles
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.Globally
-
Deprecated.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.PerKey
-
Deprecated.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.runners.flink.FlinkStreamingAggregationsTranslators.ConcatenateAsIterable
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
- populateDisplayData(DisplayData.Builder) - Method in interface org.apache.beam.sdk.transforms.display.HasDisplayData
-
Register display data for the given transform or component.
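A typical override, sketched for a hypothetical DoFn with a threshold field; implementations should call super to preserve inherited display data:

    @Override
    public void populateDisplayData(DisplayData.Builder builder) {
      super.populateDisplayData(builder);
      builder.add(DisplayData.item("threshold", threshold)  // 'threshold' is a hypothetical field
          .withLabel("Filtering threshold"));
    }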
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.DoFn
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Filter
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.FlatMapElements.FlatMapWithFailures
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.GroupByKey
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.InferableFunction
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.MapElements.MapWithFailures
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.MapElements
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Partition
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.PTransform
-
Register display data for the given transform or component.
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.Window
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Register display data for the given transform or component.
- PortableBigQueryDestinations - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- PortableBigQueryDestinations(Schema, BigQueryWriteConfiguration) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- portableMetrics() - Method in class org.apache.beam.runners.flink.FlinkPortableRunnerResult
- portableMetrics() - Method in interface org.apache.beam.runners.jobsubmission.PortablePipelineResult
-
Returns the object to access monitoring infos from the pipeline.
- PortableMetrics - Class in org.apache.beam.runners.portability
- PortablePipelineJarCreator - Class in org.apache.beam.runners.jobsubmission
-
PortablePipelineRunner
that bundles the input pipeline along with all dependencies, artifacts, etc. - PortablePipelineJarCreator(Class) - Constructor for class org.apache.beam.runners.jobsubmission.PortablePipelineJarCreator
- PortablePipelineJarUtils - Class in org.apache.beam.runners.jobsubmission
-
Contains common code for writing and reading portable pipeline jars.
- PortablePipelineJarUtils() - Constructor for class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
- PortablePipelineOptions - Interface in org.apache.beam.sdk.options
-
Pipeline options common to all portable runners.
- PortablePipelineResult - Interface in org.apache.beam.runners.jobsubmission
-
Result of a portable
PortablePipelineRunner.run(RunnerApi.Pipeline, JobInfo)
. - PortablePipelineRunner - Interface in org.apache.beam.runners.jobsubmission
-
Runs a portable Beam pipeline on some execution engine.
- PortableRunner - Class in org.apache.beam.runners.portability
- PortableRunnerRegistrar - Class in org.apache.beam.runners.portability
-
Registrar for the portable runner.
- PortableRunnerRegistrar() - Constructor for class org.apache.beam.runners.portability.PortableRunnerRegistrar
- position() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- position(long) - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- positional() - Method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
- POSITIONAL - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters.Kind
- PositionAwareCombineFn() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
- Position-based sources - Search tag in interface org.apache.beam.sdk.io.range.RangeTracker
- Section
- POSTGRES - Enum constant in enum class org.apache.beam.io.debezium.Connectors
- POSTGRES - Static variable in class org.apache.beam.sdk.io.jdbc.JdbcUtil
- PostProcessingMetricsDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
A DoFn class to gather metrics about the emitted
DataChangeRecord
s. - PostProcessingMetricsDoFn(ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.PostProcessingMetricsDoFn
- POutput - Interface in org.apache.beam.sdk.values
-
The interface for things that might be output from a
PTransform
. - PRE_DEFINED_WINDOW_FUNCTIONS - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCatalog
- precisionForRelativeError(double) - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
-
Computes the precision based on the desired relative error.
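A sketch, assuming a hypothetical PCollection<String> named urls and a 1% target relative error:

    int precision = ApproximateDistinct.precisionForRelativeError(0.01);
    PCollection<Long> distinctCount =
        urls.apply(ApproximateDistinct.<String>globally().withPrecision(precision));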
- predictAll() - Static method in class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
- prefetch() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
- prefetch() - Method in interface org.apache.beam.sdk.fn.stream.PrefetchableIterable
-
Ensures that the next iterator returned has been prefetched.
- prefetch() - Method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables.Default
- prefetch() - Method in interface org.apache.beam.sdk.fn.stream.PrefetchableIterator
-
If not
PrefetchableIterator.isReady()
, schedules the next expensive operation such that at some point in time in the futurePrefetchableIterator.isReady()
will return true. - PrefetchableIterable<T> - Interface in org.apache.beam.sdk.fn.stream
-
An
Iterable
that returnsPrefetchableIterator
s. - PrefetchableIterables - Class in org.apache.beam.sdk.fn.stream
-
This class contains static utility functions that operate on or return objects of type
PrefetchableIterable
. - PrefetchableIterables() - Constructor for class org.apache.beam.sdk.fn.stream.PrefetchableIterables
- PrefetchableIterables.Default<T> - Class in org.apache.beam.sdk.fn.stream
-
A default implementation that caches an iterator to be returned when
PrefetchableIterables.Default.prefetch()
is invoked. - PrefetchableIterator<T> - Interface in org.apache.beam.sdk.fn.stream
-
Iterator
that supports prefetching the next set of records. - PrefetchableIterators - Class in org.apache.beam.sdk.fn.stream
- PrefetchableIterators() - Constructor for class org.apache.beam.sdk.fn.stream.PrefetchableIterators
- prefetchOnMerge(MergingStateAccessor<K, W>) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
- prefetchOnTrigger(StateAccessor<K>) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
- Pre-filtering Options - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translation.wrappers.Twister2BoundedSource
- prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.AssignWindowsFunction
- prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToElemFunction
- prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunction
- prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunctionPrimitive
- prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
- prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.ElemToBytesFunction
- prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
- prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.MapToTupleFunction
- prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.OutputTagFilter
- prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
- prepare(JobApi.PrepareJobRequest, StreamObserver<JobApi.PrepareJobResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
- prepare(JobApi.PrepareJobRequest, StreamObserver<JobApi.PrepareJobResponse>) - Method in class org.apache.beam.runners.portability.testing.TestJobService
- prepareCall(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- prepareCall(String, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- prepareCall(String, int, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- prepareFilesToStage(SparkCommonPipelineOptions) - Static method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
-
Classpath contains non-jar files (e.g.
- prepareForProcessing() - Method in class org.apache.beam.sdk.transforms.DoFn
-
Deprecated.use
DoFn.Setup
orDoFn.StartBundle
instead. This method will be removed in a future release. - prepareForTranslation(RunnerApi.Pipeline) - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator
- preparePrivateKey(String, String) - Static method in class org.apache.beam.sdk.io.snowflake.KeyPairUtils
- PreparePubsubWriteDoFn<InputT> - Class in org.apache.beam.sdk.io.gcp.pubsub
- prepareRun() - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Calls the
SubmitterLifecycle.prepareRun(Object)
method on thePlugin.cdapPluginObj
, passing the needed configuration object as a parameter. - prepareSnapshotPreBarrier(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- prepareStatement(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- prepareStatement(String, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- prepareStatement(String, int[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- prepareStatement(String, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- prepareStatement(String, int, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- prepareStatement(String, String[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- prepareWrite(WritableByteChannel) - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Called with the channel that a subclass will write its header, footer, and values to.
- PrepareWrite<InputT,
DestinationT, - Class in org.apache.beam.sdk.io.gcp.bigqueryOutputT> -
Prepare an input
PCollection
for writing to BigQuery. - PrepareWrite(DynamicDestinations<InputT, DestinationT>, SerializableFunction<InputT, OutputT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite
- preProcess(WindowedValue<KV<K, InputT>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.WindowDoFnOperator
- preProcess(WindowedValue<PreInputT>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- PRESERVES_KEYS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- PRESET_HELPERS - Static variable in class org.apache.beam.sdk.io.jdbc.JdbcUtil
- PREV - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- previous(Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
- primary() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- PrimitiveParDoSingleFactory<InputT,
OutputT> - Class in org.apache.beam.runners.dataflow -
A
PTransformOverrideFactory
that producesPrimitiveParDoSingleFactory.ParDoSingle
instances fromParDo.SingleOutput
instances. - PrimitiveParDoSingleFactory() - Constructor for class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory
- PrimitiveParDoSingleFactory.ParDoSingle<InputT,
OutputT> - Class in org.apache.beam.runners.dataflow -
A single-output primitive
ParDo
. - PrimitiveParDoSingleFactory.PayloadTranslator - Class in org.apache.beam.runners.dataflow
-
A translator for PrimitiveParDoSingleFactory.ParDoSingle.
- PrimitiveParDoSingleFactory.Registrar - Class in org.apache.beam.runners.dataflow
- printHelp(PrintStream) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Outputs to the given stream the set of options registered with the PipelineOptionsFactory, with a description for each one if available.
- printHelp(PrintStream, Class<? extends PipelineOptions>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Outputs the set of options available to be set for the passed-in PipelineOptions interface.
- PrismPipelineOptions - Interface in org.apache.beam.runners.prism
- PrismRegistrar - Class in org.apache.beam.runners.prism
- PrismRegistrar.Options - Class in org.apache.beam.runners.prism
-
Registers the PrismPipelineOptions and TestPrismPipelineOptions.
- PrismRegistrar.Runner - Class in org.apache.beam.runners.prism
- PrismRunner - Class in org.apache.beam.runners.prism
-
A PipelineRunner executed on Prism.
- PrismRunner(PrismPipelineOptions) - Constructor for class org.apache.beam.runners.prism.PrismRunner
- process(byte[], DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider.ErrorFn
- process(byte[], DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider.ErrorFn
- process(int, Inbox) - Method in class org.apache.beam.runners.jet.processors.ParDoP
- process(int, Inbox) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- process(SequencedMessage, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
- process(InputT, Instant, BoundedWindow, PaneInfo, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PreparePubsubWriteDoFn
- process(List<JobMessage>) - Method in interface org.apache.beam.runners.dataflow.util.MonitoringUtil.JobMessagesHandler
-
Process the job messages.
- process(List<JobMessage>) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil.LoggingHandler
- process(Map<String, String>, RestrictionTracker<KafkaSourceConsumerFn.OffsetHolder, Map<String, Object>>, DoFn.OutputReceiver<T>) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
-
Process the retrieved element and format it for output.
- process(DataChangeRecord, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.DataChangeRecordToRow
- process(PipelineOptions, KV<String, StorageApiFlushAndFinalizeDoFn.Operation>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiFlushAndFinalizeDoFn
- process(DoFn.ProcessContext, RestrictionTracker<Watch.GrowthState, KV<Watch.Growth.PollResult<OutputT>, TerminationStateT>>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
- process(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform.SplitIntoRangesFn
- process(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform.AbstractReadFileRangesFn
- process(KV<Row, ValueT>, Instant, TimerMap, TimerMap, ValueState<SortedMap<Instant, TimestampedValue<ValueT>>>, DoFn.OutputReceiver<ValueT>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.FillGapsDoFn
- process(Row, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
- process(Row, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider.ErrorFn
- processArrayOfNestedStringField(RowBundles.ArrayOfNestedStringBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
- processArrayOfStringField(RowBundles.ArrayOfStringBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
- processBundle(InputT...) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- processBundle(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- ProcessBundleDescriptors - Class in org.apache.beam.runners.fnexecution.control
-
Utility methods for creating BeamFnApi.ProcessBundleDescriptor instances.
- ProcessBundleDescriptors() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
- ProcessBundleDescriptors.BagUserStateSpec<K, V, W> - Class in org.apache.beam.runners.fnexecution.control
-
A container type storing references to the key, value, and window Coder used when handling bag user state requests.
- ProcessBundleDescriptors.ExecutableProcessBundleDescriptor - Class in org.apache.beam.runners.fnexecution.control
- ProcessBundleDescriptors.SideInputSpec<T, W> - Class in org.apache.beam.runners.fnexecution.control
-
A container type storing references to the value and window Coder used when handling side input state requests.
- ProcessBundleDescriptors.TimerSpec<K, V, W> - Class in org.apache.beam.runners.fnexecution.control
-
A container type storing references to the key, timer, and payload coders and the remote input destination used when handling timer requests.
- processByteBufferField(RowBundles.ByteBufferBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
- processBytesField(RowBundles.BytesBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
- ProcessContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.ProcessContext
- ProcessContinuation() - Constructor for class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
- processDateTimeField(RowBundles.DateTimeBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
- processElement(byte[]) - Method in class org.apache.beam.sdk.io.pulsar.WriteToPulsarDoFn
- processElement(PubSubMessage) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
- processElement(PubSubMessage, DoFn.OutputReceiver<PubSubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage.SetUuidFn
- processElement(Struct, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.ErrorFn
- processElement(ErrorT, DoFn.OutputReceiver<ErrorT>) - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler.WriteErrorMetrics.CountErrors
- processElement(InputT) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- processElement(Iterable<KV<DestinationT, WriteTables.Result>>, DoFn.ProcessContext, BoundedWindow) - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
- processElement(Long, Instant, DoFn.OutputReceiver<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorRowFn
- processElement(String, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError
- processElement(InitialPipelineState, RestrictionTracker<OffsetRange, Long>, DoFn.OutputReceiver<PartitionRecord>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
- processElement(PartitionRecord, RestrictionTracker<StreamProgress, StreamProgress>, DoFn.OutputReceiver<KV<ByteString, ChangeStreamRecord>>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
- processElement(PubsubMessage, Instant) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
- processElement(DataChangeRecord, DoFn.OutputReceiver<DataChangeRecord>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.PostProcessingMetricsDoFn
-
Stage to measure a data record's latencies and metrics.
- processElement(PartitionMetadata, RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<DataChangeRecord>, ManualWatermarkEstimator<Instant>, DoFn.BundleFinalizer) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
-
Performs a change stream query for a given partition.
- processElement(PulsarSourceDescriptor, RestrictionTracker<OffsetRange, Long>, WatermarkEstimator, DoFn.OutputReceiver<PulsarMessage>) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
- processElement(Solace.Record, DoFn.OutputReceiver<Solace.PublishResult>) - Method in class org.apache.beam.sdk.io.solace.write.RecordToPublishResultDoFn
- processElement(Solace.Record, DoFn.OutputReceiver<KV<Integer, Solace.Record>>) - Method in class org.apache.beam.sdk.io.solace.write.AddShardKeyDoFn
- processElement(DoFn.OutputReceiver<Void>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.CleanUpReadChangeStreamDoFn
- processElement(DoFn.OutputReceiver<InitialPipelineState>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.InitializeDoFn
- processElement(DoFn.OutputReceiver<PartitionMetadata>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.InitializeDoFn
- processElement(DoFn.ProcessContext, BoundedWindow) - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
- processElement(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read.CleanTmpFilesFromGcsFn
- processElement(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read.MapCsvToStringArrayFn
- processElement(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message.HL7v2MessageGetFn
-
Process element.
- processElement(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
- processElement(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message.HL7v2MessageGetFn
-
Process element.
- processElement(DoFn.ProcessContext, PipelineOptions, KV<DestinationT, ElementT>, Instant, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages.ConvertMessagesDoFn
- processElement(DoFn.ProcessContext, BoundedWindow) - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
- processElement(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSqlOutputToConsoleFn
- processElement(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.StripIdsDoFn
- processElement(DoFn.ProcessContext, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.View.ToListViewDoFn
- processElement(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
-
Emit only as many elements as the exponentially increasing budget allows.
- processElement(RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<PartitionMetadata>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
-
Main processing function for the DetectNewPartitionsDoFn function.
- processElement(KV<ByteString, ChangeStreamRecord>, DoFn.OutputReceiver<KV<ByteString, ChangeStreamMutation>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.FilterForMutationDoFn
- processElement(KV<Integer, Solace.Record>, Timer, Instant) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedBatchedSolaceWriter
- processElement(KV<Integer, Solace.Record>, Instant, ValueState<Integer>) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedStreamingSolaceWriter
- processElement(Row, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider.ErrorFn
- processElement(Row, DoFn.OutputReceiver<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.SetOperatorFilteringDoFn
- processElement(WindowedValue<InputT>) - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
- processElement(WindowedValue<InputT>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
- processElement(WindowedValue<InputT>) - Method in class org.apache.beam.runners.spark.translation.DoFnRunnerWithMetrics
- processElement(StreamRecord<WindowedValue<ValueWithRecordId<T>>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.DedupingOperator
- processElement(StreamRecord<WindowedValue<PreInputT>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- processElement(T, DoFn.OutputReceiver<KV<Integer, T>>) - Method in class org.apache.beam.sdk.transforms.Reshuffle.AssignShardFn
- processElement1(StreamRecord<WindowedValue<PreInputT>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- processElement2(StreamRecord<RawUnionValue>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- processElementWithRunner(DoFnRunner<InputT, OutputT>, WindowedValue<InputT>) - Method in class org.apache.beam.runners.jet.processors.ParDoP
- processElementWithRunner(DoFnRunner<KV<?, ?>, OutputT>, WindowedValue<KV<?, ?>>) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- ProcessEnvironment - Class in org.apache.beam.runners.fnexecution.environment
-
Environment for process-based execution.
- ProcessEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
-
An EnvironmentFactory which forks processes based on the parameters in the Environment.
- ProcessEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
-
Provider of ProcessEnvironmentFactory.
- ProcessFunction<InputT, OutputT> - Interface in org.apache.beam.sdk.transforms
-
A function that computes an output value of type OutputT from an input value of type InputT and is Serializable.
- PROCESSING_DELAY_FROM_COMMIT_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Distribution for measuring processing delay from commit timestamp.
- PROCESSING_TIME - Enum constant in enum class org.apache.beam.sdk.state.TimeDomain
-
The TimeDomain.PROCESSING_TIME domain corresponds to the current (system) time.
- PROCESSING_TIME - Enum constant in enum class org.apache.beam.sdk.testing.TestStream.EventType
- processingStatuses() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
- ProcessingTimeEvent() - Constructor for class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
- ProcessingTimePolicy() - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
- ProcessingTimeWatermarkPolicy() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy
- processIntField(RowBundles.IntBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
- ProcessManager - Class in org.apache.beam.runners.fnexecution.environment
-
A simple process manager which forks processes and kills them if necessary.
- ProcessManager.RunningProcess - Class in org.apache.beam.runners.fnexecution.environment
- processMapOfIntField(RowBundles.MapOfIntBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
- processMapOfNestedIntField(RowBundles.MapOfNestedIntBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
- processNestedBytesField(RowBundles.NestedBytesBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
- processNestedIntField(RowBundles.NestedIntBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
- processNewPartition(NewPartition, DoFn.OutputReceiver<PartitionRecord>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ProcessNewPartitionsAction
-
Process a single new partition.
- processNewPartitionsAction(ChangeStreamMetrics, MetadataTableDao, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
- ProcessNewPartitionsAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
- ProcessNewPartitionsAction(ChangeStreamMetrics, MetadataTableDao, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ProcessNewPartitionsAction
- processNewRow(Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.nfa.NFA
- processRows(Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundle
-
Runs a benchmark iteration on a bundle of rows.
- processStringBuilderField(RowBundles.StringBuilderBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
- processStringField(RowBundles.StringBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
- processTimestampedElement(TimestampedValue<InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- processValue(ReduceFn.ProcessValueContext) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
- processWatermark(Watermark) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- processWatermark1(Watermark) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- processWatermark2(Watermark) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- processWindowedElement(InputT, Instant, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- processWith(DoFnRunner<InputT, OutputT>) - Method in interface org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferedElement
-
Processes this element with the provided DoFnRunner.
- produceResult() - Method in interface org.apache.beam.sdk.extensions.ordered.MutableState
-
This method is called after each state mutation.
- ProducerRecordCoder<K, V> - Class in org.apache.beam.sdk.io.kafka
-
Coder for ProducerRecord.
- ProducerRecordCoder(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- ProducerState() - Constructor for class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark.ProducerState
- program - Variable in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
- Progress() - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
- PROHIBIT - Enum constant in enum class org.apache.beam.sdk.io.FileIO.ReadMatches.DirectoryTreatment
- project - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- PROJECT - Enum constant in enum class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.PushDownOptions
- PROJECT_ID_REGEXP - Static variable in class org.apache.beam.runners.dataflow.DataflowRunner
-
Project IDs must contain lowercase letters, digits, or dashes.
- projectId - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
- projectId - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
- projectId - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
- projectId - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- projectId() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
- projectId() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
- projectId() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- projectId() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- projectId() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
- ProjectionConsumer - Interface in org.apache.beam.sdk.schemas
-
A ProjectionConsumer is a Schema-aware operation (such as a DoFn or PTransform) that has a FieldAccessDescriptor describing which fields the operation accesses.
- ProjectionProducer<T> - Interface in org.apache.beam.sdk.schemas
-
A factory for operations that execute a projection on a Schema-aware PCollection.
- projectPathFromId(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- projectPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- ProjectSupport - Enum Class in org.apache.beam.sdk.extensions.sql.meta
- properties() - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
-
User-specified configuration properties.
- properties() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- properties() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
- properties(ObjectNode) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
- Properties of position-based sources - Search tag in interface org.apache.beam.sdk.io.range.RangeTracker
- Section
- PROPERTY_BEAM_TEST_PIPELINE_OPTIONS - Static variable in class org.apache.beam.sdk.testing.TestPipeline
-
System property used to set TestPipelineOptions.
- propertyName() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
-
Returns the property name associated with this provider.
- propertyName() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
-
Returns the property name that corresponds to this provider.
- PropertyNames - Class in org.apache.beam.runners.dataflow.util
-
Constant property names used by the SDK in CloudWorkflow specifications.
- PropertyNames() - Constructor for class org.apache.beam.runners.dataflow.util.PropertyNames
- ProtobufByteStringOutputStream() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.ProtobufByteStringOutputStream
- ProtobufCoderProviderRegistrar - Class in org.apache.beam.sdk.extensions.protobuf
-
A CoderProviderRegistrar for standard types used with Google Protobuf.
- ProtobufCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtobufCoderProviderRegistrar
- ProtoByteUtils - Class in org.apache.beam.sdk.extensions.protobuf
-
Utility class for working with Protocol Buffer (Proto) data.
- ProtoByteUtils() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
- ProtoCoder<T> - Class in org.apache.beam.sdk.extensions.protobuf
-
A Coder using Google Protocol Buffers binary format.
- ProtoCoder(Class<T>, Set<Class<?>>) - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
Private constructor.
- ProtoCoder and Determinism - Search tag in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- Section
- ProtoCoder and Encoding Stability - Search tag in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- Section
- ProtoDomain - Class in org.apache.beam.sdk.extensions.protobuf
-
ProtoDomain is a container class for Protobuf descriptors.
- ProtoDynamicMessageSchema<T> - Class in org.apache.beam.sdk.extensions.protobuf
- ProtoFromBytes<T> - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- ProtoMessageSchema - Class in org.apache.beam.sdk.extensions.protobuf
- ProtoMessageSchema() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
- protoModeToJsonMode(TableFieldSchema.Mode) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- ProtoPayloadSerializerProvider - Class in org.apache.beam.sdk.extensions.protobuf
- ProtoPayloadSerializerProvider() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoPayloadSerializerProvider
- ProtoSchemaLogicalTypes - Class in org.apache.beam.sdk.extensions.protobuf
-
A set of Schema.LogicalType classes to represent protocol buffer types.
- ProtoSchemaLogicalTypes() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes
- ProtoSchemaLogicalTypes.DurationConvert - Class in org.apache.beam.sdk.extensions.protobuf
- ProtoSchemaLogicalTypes.Fixed32 - Class in org.apache.beam.sdk.extensions.protobuf
-
A Fixed32 type.
- ProtoSchemaLogicalTypes.Fixed64 - Class in org.apache.beam.sdk.extensions.protobuf
-
A Fixed64 type.
- ProtoSchemaLogicalTypes.SFixed32 - Class in org.apache.beam.sdk.extensions.protobuf
-
A SFixed32 type.
- ProtoSchemaLogicalTypes.SFixed64 - Class in org.apache.beam.sdk.extensions.protobuf
-
An SFixed64 type.
- ProtoSchemaLogicalTypes.SInt32 - Class in org.apache.beam.sdk.extensions.protobuf
-
A SInt32 type.
- ProtoSchemaLogicalTypes.SInt64 - Class in org.apache.beam.sdk.extensions.protobuf
-
An SInt64 type.
- ProtoSchemaLogicalTypes.TimestampConvert - Class in org.apache.beam.sdk.extensions.protobuf
- ProtoSchemaLogicalTypes.UInt32 - Class in org.apache.beam.sdk.extensions.protobuf
-
A UInt32 type.
- ProtoSchemaLogicalTypes.UInt64 - Class in org.apache.beam.sdk.extensions.protobuf
-
A UInt64 type.
- protoSchemaToTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- protoTableFieldToTableField(TableFieldSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- protoTableSchemaFromAvroSchema(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.AvroGenericRecordToStorageApiProto
-
Given an Avro Schema, returns a protocol-buffer TableSchema that can be used to write data through the BigQuery Storage API.
- ProtoToBytes<T> - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- ProtoToBytes() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.ProtoToBytes
- protoTypeToJsonType(TableFieldSchema.Type) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- provide(String, Map<String, String>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- provider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.BigQueryStorageReadSchemaTransformTranslator
- provider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.BigQueryWriteSchemaTransformTranslator
- provider() - Method in class org.apache.beam.sdk.io.TFRecordSchemaTransformTranslation.TFRecordReadSchemaTransformTranslator
- provider() - Method in class org.apache.beam.sdk.io.TFRecordSchemaTransformTranslation.TFRecordWriteSchemaTransformTranslator
- provider() - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema.Customizer
- provider() - Static method in class org.apache.beam.sdk.io.thrift.ThriftSchema
-
Schema provider that maps any thrift type to a Beam schema, assuming that any typedefs that might have been used in the thrift definitions will preserve all required metadata to infer the beam type (which is the case for any primitive typedefs and alike).
- provider() - Method in class org.apache.beam.sdk.schemas.transforms.SchemaTransformTranslation.SchemaTransformPayloadTranslator
- Provider() - Constructor for class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.Provider
- Provider(InstructionRequestHandler) - Constructor for class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironmentFactory.Provider
- Provider(PipelineOptions) - Constructor for class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.Provider
- Provider(PipelineOptions) - Constructor for class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.Provider
- Provider(PipelineOptions) - Constructor for class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory.Provider
- Providers - Class in org.apache.beam.sdk.schemas.io
-
Helpers for implementing the "Provider" pattern.
- Providers.Identifyable - Interface in org.apache.beam.sdk.schemas.io
- PTransform<InputT, OutputT> - Class in org.apache.beam.sdk.transforms
- PTransform() - Constructor for class org.apache.beam.sdk.transforms.PTransform
- PTransform(String) - Constructor for class org.apache.beam.sdk.transforms.PTransform
- PTransformErrorHandler(PTransform<PCollection<ErrorT>, OutputT>, Pipeline, Coder<ErrorT>) - Constructor for class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler
-
Constructs a new ErrorHandler, but should not be called directly.
- publish(List<PubsubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Publish messages to TestPubsub.topicPath().
- publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Publish outgoingMessages to Pubsub topic.
- publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- publishBatch(List<Solace.Record>, boolean, SerializableFunction<Solace.Record, Destination>, DeliveryMode) - Method in interface org.apache.beam.sdk.io.solace.broker.MessageProducer
-
Publishes a batch of messages to the broker.
- publishBatch(List<Solace.Record>, boolean, SerializableFunction<Solace.Record, Destination>, DeliveryMode) - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageProducer
- PublisherOptions - Class in org.apache.beam.sdk.io.gcp.pubsublite
-
Options needed for a Pub/Sub Lite Publisher.
- PublisherOptions() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions
- PublisherOptions.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
- Publishing messages to RabbitMQ server - Search tag in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO
- Section
- publishLatencyMetrics() - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
-
Publish latency metrics using Beam metrics.
- PublishResult() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
- PublishResultHandler - Class in org.apache.beam.sdk.io.solace.broker
-
This class is required to handle callbacks from Solace, to find out whether messages were actually published or whether any error occurred.
- PublishResultHandler(Queue<Solace.PublishResult>) - Constructor for class org.apache.beam.sdk.io.solace.broker.PublishResultHandler
- publishResults(UnboundedSolaceWriter.BeamContextWrapper) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- publishSingleMessage(Solace.Record, Destination, boolean, DeliveryMode) - Method in interface org.apache.beam.sdk.io.solace.broker.MessageProducer
-
Publishes a message to the broker.
- publishSingleMessage(Solace.Record, Destination, boolean, DeliveryMode) - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageProducer
- PUBSUB_DYNAMIC_DESTINATIONS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- PUBSUB_ID_ATTRIBUTE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- PUBSUB_SERIALIZED_ATTRIBUTES_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- PUBSUB_SUBSCRIPTION - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- PUBSUB_SUBSCRIPTION_OVERRIDE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- PUBSUB_TIMESTAMP_ATTRIBUTE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- PUBSUB_TOPIC - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- PUBSUB_TOPIC_OVERRIDE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- PubsubClient - Class in org.apache.beam.sdk.io.gcp.pubsub
-
An (abstract) helper class for talking to Pubsub via an underlying transport.
- PubsubClient() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- PubsubClient.IncomingMessage - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A message received from Pubsub.
- PubsubClient.OutgoingMessage - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A message to be sent to Pubsub.
- PubsubClient.ProjectPath - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Path representing a cloud project id.
- PubsubClient.PubsubClientFactory - Interface in org.apache.beam.sdk.io.gcp.pubsub
-
Factory for creating clients.
- PubsubClient.SchemaPath - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Path representing a Pubsub schema.
- PubsubClient.SubscriptionPath - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Path representing a Pubsub subscription.
- PubsubClient.TopicPath - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Path representing a Pubsub topic.
- PubsubCoderProviderRegistrar - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A CoderProviderRegistrar for standard types used with PubsubIO.
- PubsubCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubCoderProviderRegistrar
- PubsubDlqProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubDlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubDlqProvider
- PubsubGrpcClient - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A helper class for talking to Pubsub via grpc.
- PubsubIO - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Read and Write PTransforms for Cloud Pub/Sub streams.
- PubsubIO.PubsubSubscription - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Class representing a Cloud Pub/Sub Subscription.
- PubsubIO.PubsubTopic - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Class representing a Cloud Pub/Sub Topic.
- PubsubIO.Read<T> - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Implementation of read methods.
- PubsubIO.Write<T> - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Implementation of write methods.
- PubsubIO.Write.PubsubBoundedWriter - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Writer to Pubsub which batches messages from bounded collections.
- PubsubJsonClient - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A Pubsub client using JSON transport.
- PubsubLiteIO - Class in org.apache.beam.sdk.io.gcp.pubsublite
-
I/O transforms for reading from Google Pub/Sub Lite.
- PubsubLiteReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- PubsubLiteReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.pubsublite
- PubsubLiteReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- PubsubLiteReadSchemaTransformProvider.ErrorFn - Class in org.apache.beam.sdk.io.gcp.pubsublite
- PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.pubsublite
- PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
- PubsubLiteSink - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
A sink which publishes messages to Pub/Sub Lite.
- PubsubLiteSink(PublisherOptions) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
- PubsubLiteTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite
-
Pub/Sub Lite table provider.
- PubsubLiteTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite.PubsubLiteTableProvider
- PubsubLiteWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- PubsubLiteWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.pubsublite
- PubsubLiteWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn - Class in org.apache.beam.sdk.io.gcp.pubsublite
- PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.pubsublite
- PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
- PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage - Class in org.apache.beam.sdk.io.gcp.pubsublite
- PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage.SetUuidFn - Class in org.apache.beam.sdk.io.gcp.pubsublite
- PubsubMessage - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Class representing a Pub/Sub message.
- PubsubMessage(byte[], Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- PubsubMessage(byte[], Map<String, String>, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- PubsubMessage(byte[], Map<String, String>, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- PubsubMessagePayloadOnlyCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A coder for PubsubMessage treating the raw bytes being decoded as the message's payload.
- PubsubMessagePayloadOnlyCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- PubsubMessages - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Common util functions for converting between the PubsubMessage proto and PubsubMessage.
- PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubMessages.ParsePayloadAsPubsubMessageProto - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubMessages.ParsePubsubMessageProtoAsPayload - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubMessageSchemaCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Provides a SchemaCoder for PubsubMessage, including the topic and all fields of a PubSub message from the server.
- PubsubMessageSchemaCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageSchemaCoder
- PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A coder for PubsubMessage including all fields of a PubSub message from the server.
- PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
- PubsubMessageWithAttributesAndMessageIdCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A coder for PubsubMessage including attributes and the message id from the PubSub server.
- PubsubMessageWithAttributesAndMessageIdCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
- PubsubMessageWithAttributesCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A coder for PubsubMessage including attributes.
- PubsubMessageWithAttributesCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- PubsubMessageWithMessageIdCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A coder for PubsubMessage treating the raw bytes being decoded as the message's payload, with the message id from the PubSub server.
- PubsubMessageWithMessageIdCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
- PubsubMessageWithTopicCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A coder for PubsubMessage including the topic from the PubSub server.
- PubsubMessageWithTopicCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
- PubsubOptions - Interface in org.apache.beam.sdk.io.gcp.pubsub
-
Properties that can be set when using Google Cloud Pub/Sub with the Apache Beam SDK.
- PubSubPayloadTranslation - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubSubPayloadTranslation() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation
- PubSubPayloadTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubSubPayloadTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Configuration for reading from Pub/Sub.
- PubsubReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
- PubsubReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubReadSchemaTransformConfiguration.ErrorHandling - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubReadSchemaTransformConfiguration.ErrorHandling.Builder - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
-
An implementation of TypedSchemaTransformProvider for Pub/Sub reads configured using PubsubReadSchemaTransformConfiguration.
- PubsubReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- PubsubSchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
-
An implementation of SchemaIOProvider for reading and writing JSON/AVRO payloads with PubsubIO.
- PubsubSchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- PubsubTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.pubsub
-
TableProvider for PubsubIO for consumption by Beam SQL.
- PubsubTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubTableProvider
- PubsubTestClient - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A (partial) implementation of PubsubClient for use by unit tests.
- PubsubTestClient() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- PubsubTestClient.PubsubTestClientFactory - Interface in org.apache.beam.sdk.io.gcp.pubsub
-
Closing the factory will validate all expected messages were processed.
- PubsubUnboundedSink - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A PTransform which streams messages to Pubsub.
- PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
- PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int, boolean, int, int) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
- PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int, boolean, int, int, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
- PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int, boolean, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
- PubsubUnboundedSource - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Users should use PubsubIO#read instead.
- PubsubUnboundedSource(Clock, PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.ProjectPath>, ValueProvider<PubsubClient.TopicPath>, ValueProvider<PubsubClient.SubscriptionPath>, String, String, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Construct an unbounded source to consume from the Pubsub subscription.
- PubsubUnboundedSource(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.ProjectPath>, ValueProvider<PubsubClient.TopicPath>, ValueProvider<PubsubClient.SubscriptionPath>, String, String, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Construct an unbounded source to consume from the Pubsub subscription.
- PubsubUnboundedSource(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.ProjectPath>, ValueProvider<PubsubClient.TopicPath>, ValueProvider<PubsubClient.SubscriptionPath>, String, String, boolean, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Construct an unbounded source to consume from the Pubsub subscription.
- PubsubWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Configuration for writing to Pub/Sub.
- PubsubWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
- PubsubWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubWriteSchemaTransformConfiguration.ErrorHandling - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubWriteSchemaTransformConfiguration.ErrorHandling.Builder - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
-
An implementation of TypedSchemaTransformProvider for Pub/Sub writes configured using PubsubWriteSchemaTransformConfiguration.
- PubsubWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
- PubsubWriteSchemaTransformProvider.ErrorFn - Class in org.apache.beam.sdk.io.gcp.pubsub
- pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Request the next batch of up to batchSize messages from subscription.
- pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- PulsarIO - Class in org.apache.beam.sdk.io.pulsar
-
Class for reading from and writing to Apache Pulsar.
- PulsarIO.Read - Class in org.apache.beam.sdk.io.pulsar
- PulsarIO.Write - Class in org.apache.beam.sdk.io.pulsar
- PulsarMessage - Class in org.apache.beam.sdk.io.pulsar
-
Class representing a Pulsar Message record.
- PulsarMessage(String, Long) - Constructor for class org.apache.beam.sdk.io.pulsar.PulsarMessage
- PulsarMessage(String, Long, Object) - Constructor for class org.apache.beam.sdk.io.pulsar.PulsarMessage
- PulsarMessageCoder - Class in org.apache.beam.sdk.io.pulsar
- PulsarMessageCoder() - Constructor for class org.apache.beam.sdk.io.pulsar.PulsarMessageCoder
- PulsarSourceDescriptor - Class in org.apache.beam.sdk.io.pulsar
- PulsarSourceDescriptor() - Constructor for class org.apache.beam.sdk.io.pulsar.PulsarSourceDescriptor
- PUSH_DOWN_OPTION - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- pushArrayBuffer(ArrayBuffer<?>, Option<Object>, Option<StreamBlockId>) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
- pushbackDoFnRunner - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- pushBytes(ByteBuffer, Option<Object>, Option<StreamBlockId>) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
- pushDataset(String, Dataset) - Method in class org.apache.beam.runners.spark.translation.SparkTranslationContext
-
Add output of transform to context.
- pushIterator(Iterator<?>, Option<Object>, Option<StreamBlockId>) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
- pushSingle(Object) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
- put(String, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.GcsCustomAuditEntries
- put(String, InstructionRequestHandler) - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool.Sink
-
Puts an InstructionRequestHandler into a client pool.
- put(K2, V2) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- put(K, V) - Method in interface org.apache.beam.sdk.state.MapState
-
Associates the specified value with the specified key in this state.
- put(K, V) - Method in interface org.apache.beam.sdk.state.MultimapState
-
Associates the specified value with the specified key in this multimap.
- put(T) - Method in class org.apache.beam.sdk.fn.CancellableQueue
-
Adds an element to this queue.
- putAll(Map<? extends K2, ? extends V2>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- putDataset(PTransform<?, ? extends PValue>, Dataset) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Add single output of transform to context map and possibly cache if it conforms to EvaluationContext.shouldCache(PTransform, PValue).
- putDataset(PCollection<T>, Dataset<WindowedValue<T>>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
- putDataset(PCollection<T>, Dataset<WindowedValue<T>>, boolean) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
- putDataset(PCollection<T>, Dataset<WindowedValue<T>>, boolean) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- putDataset(PValue, Dataset) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Add output of transform to context map and possibly cache if it conforms to EvaluationContext.shouldCache(PTransform, PValue).
- putIfAbsent(K, V) - Method in interface org.apache.beam.sdk.state.MapState
-
A deferred read-followed-by-write.
- putNormalizedKey(byte[], MemorySegment, int, int) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- putPView(PCollectionView<?>, Iterable<WindowedValue<?>>, Coder<Iterable<WindowedValue<?>>>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Adds/replaces a view among the current views created in the pipeline.
- putPView(PCollectionView<?>, Iterable<WindowedValue<?>>, Coder<Iterable<WindowedValue<?>>>) - Method in class org.apache.beam.runners.spark.translation.SparkPCollectionView
- putSchemaIfAbsent(TableReference, TableSchema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
-
Registers the schema for a table if one is not already present.
- putStreamingPView(PCollectionView<?>, Iterable<WindowedValue<?>>, Coder<Iterable<WindowedValue<?>>>) - Method in class org.apache.beam.runners.spark.translation.SparkPCollectionView
- putUnresolved(PCollection<OutT>, PipelineTranslator.UnresolvedTranslation<InT, OutT>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
- putUnresolved(PCollection<T>, PipelineTranslator.UnresolvedTranslation<InputT, T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- PValue - Interface in org.apache.beam.sdk.values
-
For internal use.
- PValueBase - Class in org.apache.beam.sdk.values
-
For internal use.
- PValueBase() - Constructor for class org.apache.beam.sdk.values.PValueBase
-
No-arg constructor to allow subclasses to implement Serializable.
- PValueBase(Pipeline) - Constructor for class org.apache.beam.sdk.values.PValueBase
- PValues - Class in org.apache.beam.sdk.values
-
For internal use.
- PythonCallable - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A logical type for PythonCallableSource objects.
- PythonCallable() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
- PythonExternalTransform<InputT, OutputT> - Class in org.apache.beam.sdk.extensions.python
-
Wrapper for invoking external Python transforms.
- PythonExternalTransformOptions - Interface in org.apache.beam.sdk.extensions.python
-
Pipeline options for PythonExternalTransform.
- PythonExternalTransformOptionsRegistrar - Class in org.apache.beam.sdk.extensions.python
-
A registrar for PythonExternalTransformOptions.
- PythonExternalTransformOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.python.PythonExternalTransformOptionsRegistrar
- PythonMap<InputT, OutputT> - Class in org.apache.beam.sdk.extensions.python.transforms
-
Wrapper for invoking external Python Map transforms.
- PythonService - Class in org.apache.beam.sdk.extensions.python
-
Utility to bootstrap and start a Beam Python service.
- PythonService(String, String...) - Constructor for class org.apache.beam.sdk.extensions.python.PythonService
- PythonService(String, List<String>) - Constructor for class org.apache.beam.sdk.extensions.python.PythonService
- PythonService(String, List<String>, List<String>) - Constructor for class org.apache.beam.sdk.extensions.python.PythonService
Q
- QMARK - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
- QMARK_RELUCTANT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
- qualifiedComponent() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- qualifiedComponent() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifyComponentContext
- QualifiedComponentContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
- Qualifier() - Constructor for class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
- QUALIFIER_DEFAULT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- qualifierList() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
- qualifierList() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
- qualifierList() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
- qualifierList() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- qualifierList(int) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
- qualifierList(int) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
- QualifierListContext() - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifierListContext
- QualifierListContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifierListContext
- QualifyComponentContext(FieldSpecifierNotationParser.DotExpressionComponentContext) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifyComponentContext
- Quantifier - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
The Quantifier class is intended for storing the information of the quantifier for a pattern variable.
- query(String) - Static method in class org.apache.beam.sdk.extensions.sql.SqlTransform
-
Returns a SqlTransform representing an equivalent execution plan.
- query(MetricResults, Lineage.Type) - Static method in class org.apache.beam.sdk.metrics.Lineage
-
Query BoundedTrie metrics from MetricResults.
- query(MetricResults, Lineage.Type, String) - Static method in class org.apache.beam.sdk.metrics.Lineage
-
Query BoundedTrie metrics from MetricResults.
- QUERY_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of queries issued during the execution of the Connector.
- queryChangeStreamAction(ChangeStreamDao, PartitionMetadataDao, ChangeStreamRecordMapper, PartitionMetadataMapper, DataChangeRecordAction, HeartbeatRecordAction, ChildPartitionsRecordAction, PartitionStartRecordAction, PartitionEndRecordAction, PartitionEventRecordAction, ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a single instance of an action class capable of performing a change stream query for a given partition.
- QueryChangeStreamAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
Main action class for querying a partition change stream.
- queryMetrics(MetricsFilter) - Method in class org.apache.beam.runners.jet.metrics.JetMetricResults
- queryMetrics(MetricsFilter) - Method in class org.apache.beam.runners.portability.PortableMetrics
- queryMetrics(MetricsFilter) - Method in class org.apache.beam.sdk.metrics.MetricResults
-
Query for all metric values that match a given filter.
- QueryParameters() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
- QueryPlanner - Interface in org.apache.beam.sdk.extensions.sql.impl
-
An interface that planners should implement to convert a SQL statement to a BeamRelNode or SqlNode.
- QueryPlanner.Factory - Interface in org.apache.beam.sdk.extensions.sql.impl
- QueryPlanner.QueryParameters - Class in org.apache.beam.sdk.extensions.sql.impl
- QueryPlanner.QueryParameters.Kind - Enum Class in org.apache.beam.sdk.extensions.sql.impl
- queryResultHasChecksum(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- QueryStatementConverter - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
-
Converts a resolved Zeta SQL query represented by a tree to the corresponding Calcite representation.
- QueryTrait - Class in org.apache.beam.sdk.extensions.sql.zetasql
-
QueryTrait.
- QueryTrait() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
- queryUnflattened(String, String, boolean, boolean) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Performs a query without flattening results.
- queryUnflattened(String, String, boolean, boolean, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Performs a query without flattening results.
- queryWithRetries(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- queryWithRetries(String, String, boolean) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- queryWithRetriesUsingStandardSql(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- Queue() - Constructor for class org.apache.beam.sdk.io.solace.data.Semp.Queue
- QUEUE - Enum constant in enum class org.apache.beam.sdk.io.solace.data.Solace.DestinationType
- QueueData() - Constructor for class org.apache.beam.sdk.io.solace.data.Semp.QueueData
- queueUrl(T) - Method in interface org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches.DynamicDestination
- Quick Overview - Search tag in class org.apache.beam.io.debezium.DebeziumIO
- Section
- Quick Overview - Search tag in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
- Section
- quoteIdentifier(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
R
- RabbitMqIO - Class in org.apache.beam.sdk.io.rabbitmq
-
An IO to publish or consume messages with a RabbitMQ broker.
- RabbitMqIO.Read - Class in org.apache.beam.sdk.io.rabbitmq
-
A PTransform to consume messages from a RabbitMQ server.
- RabbitMqIO.Write - Class in org.apache.beam.sdk.io.rabbitmq
-
A PTransform to publish messages to a RabbitMQ server.
- RabbitMqMessage - Class in org.apache.beam.sdk.io.rabbitmq
-
It contains the message payload, and additional metadata like routing key or attributes.
- RabbitMqMessage(byte[]) - Constructor for class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- RabbitMqMessage(String, byte[], String, String, Map<String, Object>, Integer, Integer, String, String, String, String, Date, String, String, String, String) - Constructor for class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- RabbitMqMessage(String, GetResponse) - Constructor for class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
- RampupThrottlingFn<T> - Class in org.apache.beam.sdk.io.gcp.datastore
-
An implementation of a client-side throttler that enforces a gradual ramp-up, broadly in line with Datastore best practices.
- RampupThrottlingFn(int, PCollectionView<Instant>) - Constructor for class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
- RampupThrottlingFn(ValueProvider<Integer>, PCollectionView<Instant>) - Constructor for class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
- random() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
- RandomAccessData - Class in org.apache.beam.runners.dataflow.util
-
An elastic-sized byte array which allows you to manipulate it as a stream, or access it directly.
- RandomAccessData() - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Constructs a RandomAccessData with a default buffer size.
- RandomAccessData(byte[]) - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Constructs a RandomAccessData with the initial buffer.
- RandomAccessData(int) - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Constructs a RandomAccessData with the given buffer size.
- RandomAccessData.RandomAccessDataCoder - Class in org.apache.beam.runners.dataflow.util
-
A Coder which encodes the valid parts of this stream.
- RandomAccessData.UnsignedLexicographicalComparator - Class in org.apache.beam.runners.dataflow.util
-
A Comparator that compares two byte arrays lexicographically.
- RandomAccessDataCoder() - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- range - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- range - Variable in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
- RANGE_OFFSET - Static variable in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
- RangeTracker<PositionT> - Interface in org.apache.beam.sdk.io.range
-
A RangeTracker is a thread-safe helper object for implementing dynamic work rebalancing in position-based BoundedSource.BoundedReader subclasses.
- Rate() - Constructor for class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate
- RateLimitPolicy - Interface in org.apache.beam.sdk.io.aws2.kinesis
- RateLimitPolicyFactory - Interface in org.apache.beam.sdk.io.aws2.kinesis
-
Implement this interface to create a RateLimitPolicy.
- RateLimitPolicyFactory.DefaultRateLimiter - Class in org.apache.beam.sdk.io.aws2.kinesis
-
Default rate limiter that throttles reading from a shard using an exponential backoff if the response is empty or if the consumer is throttled by AWS.
- RateLimitPolicyFactory.DelayIntervalRateLimiter - Class in org.apache.beam.sdk.io.aws2.kinesis
- RawUnionValue - Class in org.apache.beam.sdk.transforms.join
-
This corresponds to an integer union tag and value.
- RawUnionValue(int, Object) - Constructor for class org.apache.beam.sdk.transforms.join.RawUnionValue
-
Constructs a partial union from the given union tag and value.
- reachedEnd() - Method in class org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat
- reachedEnd() - Method in class org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat
- read() - Static method in class org.apache.beam.io.debezium.DebeziumIO
-
Read data from a Debezium source.
- read() - Method in class org.apache.beam.runners.flink.translation.wrappers.DataInputViewWrapper
- read() - Static method in class org.apache.beam.sdk.io.amqp.AmqpIO
- read() - Static method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO
- read() - Static method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
-
Returns a new KinesisIO.Read transform for reading from Kinesis.
- read() - Static method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO
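A minimal sketch of the KinesisIO.read() entry above; the stream name is a hypothetical placeholder:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.aws2.kinesis.KinesisIO;
import org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord;
import org.apache.beam.sdk.values.PCollection;
import software.amazon.kinesis.common.InitialPositionInStream;

class KinesisReadExample {
  static PCollection<KinesisRecord> records(Pipeline p) {
    return p.apply(
        KinesisIO.read()
            .withStreamName("my-stream") // hypothetical stream name
            .withInitialPositionInStream(InitialPositionInStream.LATEST));
  }
}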
- read() - Static method in class org.apache.beam.sdk.io.cassandra.CassandraIO
-
Provide a CassandraIO.Read PTransform to read data from a Cassandra database.
- read() - Static method in class org.apache.beam.sdk.io.cdap.CdapIO
- read() - Static method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO
-
A PTransform that reads from one or more text files and returns a bounded PCollection containing one element for each line in the input files.
- read() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
- read() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Deprecated. Use BigQueryIO.read(SerializableFunction) or BigQueryIO.readTableRows() instead. BigQueryIO.readTableRows() does exactly the same as BigQueryIO.read(), however BigQueryIO.read(SerializableFunction) performs better.
- read() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
-
Creates an uninitialized BigtableIO.Read.
- read() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
-
Returns an empty DatastoreV1.Read builder.
- read() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1
-
The class returned by this method provides the ability to create PTransforms for read operations available in the Firestore V1 API provided by FirestoreStub.
- read() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
-
Creates an uninitialized instance of SpannerIO.Read.
- read() - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsIO
- read() - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19
- read() - Static method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
-
Creates an uninitialized HadoopFormatIO.Read.
- read() - Static method in class org.apache.beam.sdk.io.hbase.HBaseIO
-
Creates an uninitialized HBaseIO.Read.
- read() - Static method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO
-
Read data from Hive.
- read() - Static method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO
- read() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
-
Read data from a JDBC datasource.
- read() - Static method in class org.apache.beam.sdk.io.jms.JmsIO
- read() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
-
Creates an uninitialized KafkaIO.Read PTransform.
- read() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
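A minimal sketch of the KafkaIO.read() entry above; the broker address and topic are hypothetical placeholders:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.io.kafka.KafkaRecord;
import org.apache.beam.sdk.values.PCollection;
import org.apache.kafka.common.serialization.LongDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

class KafkaReadExample {
  static PCollection<KafkaRecord<Long, String>> records(Pipeline p) {
    return p.apply(
        KafkaIO.<Long, String>read()
            .withBootstrapServers("broker-1:9092") // hypothetical broker address
            .withTopic("my-topic")                 // hypothetical topic
            .withKeyDeserializer(LongDeserializer.class)
            .withValueDeserializer(StringDeserializer.class));
  }
}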
- read() - Static method in class org.apache.beam.sdk.io.kudu.KuduIO
- read() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
-
Read data from GridFS.
- read() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbIO
-
Read data from MongoDB.
- read() - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO
- read() - Static method in class org.apache.beam.sdk.io.pulsar.PulsarIO
-
Read from Apache Pulsar.
- read() - Static method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO
- read() - Static method in class org.apache.beam.sdk.io.redis.RedisIO
-
Read data from a Redis server.
- read() - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
-
Read data from a SingleStoreDB datasource.
- read() - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO
-
Read data from Snowflake via COPY statement using default SnowflakeBatchServiceImpl.
- read() - Static method in class org.apache.beam.sdk.io.solace.SolaceIO
-
Create a SolaceIO.Read transform, to read from Solace.
- read() - Static method in class org.apache.beam.sdk.io.solr.SolrIO
- read() - Static method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO
- read() - Static method in class org.apache.beam.sdk.io.TextIO
-
A PTransform that reads from one or more text files and returns a bounded PCollection containing one element for each line of the input files.
- read() - Static method in class org.apache.beam.sdk.io.TFRecordIO
-
A PTransform that reads from a TFRecord file (or multiple TFRecord files matching a pattern) and returns a PCollection containing the decoding of each of the records of the TFRecord file(s) as a byte array.
- read() - Static method in class org.apache.beam.sdk.io.xml.XmlIO
-
Reads XML files as a PCollection of a given type mapped via JAXB.
- read() - Method in interface org.apache.beam.sdk.state.BagState
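A minimal sketch of the TextIO.read() entry above; the file pattern is a hypothetical placeholder:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.values.PCollection;

class TextReadExample {
  static PCollection<String> lines(Pipeline p) {
    // One element per line of every file matching the (hypothetical) pattern.
    return p.apply(TextIO.read().from("gs://my-bucket/logs/*.txt"));
  }
}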
- read() - Method in interface org.apache.beam.sdk.state.CombiningState
- read() - Method in interface org.apache.beam.sdk.state.ReadableState
-
Read the current value, blocking until it is available.
- read() - Method in interface org.apache.beam.sdk.state.ValueState
-
Read the current value, blocking until it is available.
- read(byte[], int, int) - Method in class org.apache.beam.runners.flink.translation.wrappers.DataInputViewWrapper
- read(Kryo, Input) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues.BaseSideInputValues
- read(Kryo, Input, Class<ValueAndCoderLazySerializable<T>>) - Method in class org.apache.beam.runners.spark.translation.ValueAndCoderKryoSerializer
- read(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Reads records of the given type from an Avro file (or multiple Avro files matching a pattern).
- read(Class<T>) - Static method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO
-
Provide a CosmosIO.Read PTransform to read data from a Cosmos DB.
- read(Object, Decoder) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder.TimestampEncoding
-
Deserializes a Timestamp from the given Decoder.
- read(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from a single store.
- read(String) - Static method in class org.apache.beam.sdk.managed.Managed
-
Instantiates a Managed.ManagedTransform transform for the specified source.
- read(ByteBuffer) - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingReadableByteChannel
- read(ByteBuffer) - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- read(Schema) - Static method in class org.apache.beam.sdk.io.parquet.ParquetIO
-
Reads GenericRecord from a Parquet file (or multiple Parquet files matching the pattern).
- read(SubscriberOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
-
Read messages from Pub/Sub Lite.
- read(SnowflakeBatchServiceConfig) - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceImpl
-
Reading data from Snowflake tables in batch processing.
- read(SnowflakeBatchServiceConfig) - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices.BatchService
- read(SnowflakeServices) - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO
-
Read data from Snowflake via COPY statement using user-defined SnowflakeServices.
- read(SnowflakeStreamingServiceConfig) - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices.StreamingService
- read(SnowflakeStreamingServiceConfig) - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceImpl
-
Reading data from Snowflake in streaming mode is not supported.
- read(ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from a single store.
- read(SerializableFunction<SchemaAndRecord, T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Reads from a BigQuery table or query and returns a PCollection with one element for each row of the table or query result, parsed from the BigQuery AVRO format using the specified function.
- read(TypeDescriptor<T>, SerializableFunction<BytesXMLMessage, T>, SerializableFunction<T, Instant>) - Static method in class org.apache.beam.sdk.io.solace.SolaceIO
-
Create a SolaceIO.Read transform, to read from Solace.
- read(DataInputView) - Method in class org.apache.beam.runners.flink.adapter.FlinkKey
- read(JavaStreamingContext, SerializablePipelineOptions, UnboundedSource<T, CheckpointMarkT>, String) - Static method in class org.apache.beam.runners.spark.io.SparkUnboundedSource
- read(T) - Method in interface org.apache.beam.sdk.fn.stream.DataStreams.OutputChunkConsumer
- Read - Search tag in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1
- Section
- Read - Class in org.apache.beam.sdk.io
-
A PTransform for reading from a Source.
- Read() - Constructor for class org.apache.beam.io.debezium.DebeziumIO.Read
- Read() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.cdap.CdapIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
-
Instantiates a new Read.
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.googleads.GoogleAdsV19.Read
- Read() - Constructor for class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.jms.JmsIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.kudu.KuduIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.Read
- Read() - Constructor for class org.apache.beam.sdk.io.redis.RedisIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.TextIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.TFRecordIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO.Read
- READ_DATA_URN - Static variable in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar
- READ_JSON_URN - Static variable in class org.apache.beam.io.debezium.DebeziumTransformRegistrar
- READ_ONCE - Enum constant in enum class org.apache.beam.sdk.jmh.schemas.RowBundle.Action
-
Read field from RowWithGetters provided by GetterBasedSchemaProvider.toRowFunction(TypeDescriptor).
- READ_REPEATED - Enum constant in enum class org.apache.beam.sdk.jmh.schemas.RowBundle.Action
-
Repeatedly (3x) read field from RowWithGetters provided by GetterBasedSchemaProvider.toRowFunction(TypeDescriptor).
- READ_TRANSFORMS - Static variable in class org.apache.beam.sdk.managed.Managed
- READ_URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
- READ_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- READ_URN - Static variable in class org.apache.beam.sdk.io.snowflake.crosslanguage.SnowflakeTransformRegistrar
- Read.Bounded<T> - Class in org.apache.beam.sdk.io
-
PTransform that reads from a BoundedSource.
- Read.Builder - Class in org.apache.beam.sdk.io
-
Helper class for building Read transforms.
- Read.Unbounded<T> - Class in org.apache.beam.sdk.io
-
PTransform that reads from an UnboundedSource.
- ReadableFileCoder - Class in org.apache.beam.sdk.io
-
A Coder for FileIO.ReadableFile.
- ReadableState<T> - Interface in org.apache.beam.sdk.state
-
A State that can be read via ReadableState.read().
- ReadableStates - Class in org.apache.beam.sdk.state
-
For internal use only; no backwards-compatibility guarantees.
- ReadableStates() - Constructor for class org.apache.beam.sdk.state.ReadableStates
- readAll() - Static method in class org.apache.beam.sdk.io.cassandra.CassandraIO
-
Provide a CassandraIO.ReadAll PTransform to read data from a Cassandra database.
- readAll() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
-
A PTransform that works like SpannerIO.read(), but executes read operations coming from a PCollection.
- readAll() - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsIO
- readAll() - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19
- readAll() - Static method in class org.apache.beam.sdk.io.hbase.HBaseIO
-
A PTransform that works like HBaseIO.read(), but executes read operations coming from a PCollection of HBaseIO.Read.
- readAll() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
-
Like JdbcIO.read(), but executes multiple instances of the query substituting each element of a PCollection as query parameters.
- readAll() - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO
-
Read all rows using a Neo4j Cypher query.
- readAll() - Static method in class org.apache.beam.sdk.io.solr.SolrIO
- readAll() - Static method in class org.apache.beam.sdk.io.TextIO
-
Deprecated. You can achieve the functionality of TextIO.readAll() using FileIO matching plus TextIO.readFiles(). This is the preferred method to make composition explicit. TextIO.ReadAll will not receive upgrades and will be removed in a future version of Beam.
- readAll(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Deprecated. You can achieve the functionality of AvroIO.readAll(java.lang.Class<T>) using FileIO matching plus AvroIO.readFiles(Class). This is the preferred method to make composition explicit. AvroIO.ReadAll will not receive upgrades and will be removed in a future version of Beam.
- readAll(List<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from multiple stores.
- readAll(ValueProvider<List<String>>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from multiple stores.
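As the deprecation notes for TextIO.readAll() above suggest, the explicit composition uses FileIO matching plus readFiles; a minimal sketch, assuming a PCollection of filepatterns as input:

import org.apache.beam.sdk.io.FileIO;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.values.PCollection;

class ReadAllMigrationExample {
  static PCollection<String> lines(PCollection<String> filepatterns) {
    return filepatterns
        .apply(FileIO.matchAll())    // expand each filepattern to matched files
        .apply(FileIO.readMatches()) // convert match results to ReadableFiles
        .apply(TextIO.readFiles());  // read each file line by line
  }
}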
- ReadAll() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
-
Deprecated.
- ReadAll() - Constructor for class org.apache.beam.sdk.io.cassandra.CassandraIO.ReadAll
- ReadAll() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- ReadAll() - Constructor for class org.apache.beam.sdk.io.googleads.GoogleAdsV19.ReadAll
- ReadAll() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
- ReadAll() - Constructor for class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- ReadAll() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.ReadAll
- ReadAll() - Constructor for class org.apache.beam.sdk.io.TextIO.ReadAll
-
Deprecated.
- readAllGenericRecords(String) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Deprecated. You can achieve the functionality of AvroIO.readAllGenericRecords(String) using FileIO matching plus AvroIO.readFilesGenericRecords(String). This is the preferred method to make composition explicit. AvroIO.ReadAll will not receive upgrades and will be removed in a future version of Beam.
- readAllGenericRecords(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Deprecated. You can achieve the functionality of AvroIO.readAllGenericRecords(Schema) using FileIO matching plus AvroIO.readFilesGenericRecords(Schema). This is the preferred method to make composition explicit. AvroIO.ReadAll will not receive upgrades and will be removed in a future version of Beam.
- readAllRequests() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Retrieve all HL7v2 Messages from a PCollection of HL7v2ReadParameter.
- readAllStreamPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Read all the StreamPartition and output PartitionRecord to stream them.
- ReadAllViaFileBasedSource<T> - Class in org.apache.beam.sdk.io
-
Reads each file in the input PCollection of FileIO.ReadableFile using given parameters for splitting files into offset ranges and for creating a FileBasedSource for a file.
- ReadAllViaFileBasedSource(long, SerializableFunction<String, ? extends FileBasedSource<T>>, Coder<T>) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSource
- ReadAllViaFileBasedSource(long, SerializableFunction<String, ? extends FileBasedSource<T>>, Coder<T>, boolean, ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSource
- ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler - Class in org.apache.beam.sdk.io
-
A class to handle errors which occur during file reads.
- ReadAllViaFileBasedSourceTransform<InT, T> - Class in org.apache.beam.sdk.io
- ReadAllViaFileBasedSourceTransform(long, SerializableFunction<String, ? extends FileBasedSource<InT>>, Coder<T>) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- ReadAllViaFileBasedSourceTransform(long, SerializableFunction<String, ? extends FileBasedSource<InT>>, Coder<T>, boolean, ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- ReadAllViaFileBasedSourceTransform.AbstractReadFileRangesFn<InT, T> - Class in org.apache.beam.sdk.io
- ReadAllViaFileBasedSourceTransform.SplitIntoRangesFn - Class in org.apache.beam.sdk.io
- ReadAllViaFileBasedSourceWithFilename<T> - Class in org.apache.beam.sdk.io
-
Reads each file of the input PCollection and outputs each element as the value of a KV, where the key is the filename from which that value came.
- ReadAllViaFileBasedSourceWithFilename(long, SerializableFunction<String, ? extends FileBasedSource<T>>, Coder<KV<String, T>>) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceWithFilename
- readAllWithFilter(List<String>, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from multiple stores matching a filter.
- readAllWithFilter(ValueProvider<List<String>>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from multiple stores matching a filter.
- readAsJson() - Static method in class org.apache.beam.io.debezium.DebeziumIO
-
Read data from a Debezium source and convert a Kafka SourceRecord into a JSON string using SourceRecordJson.SourceRecordJsonMapper as the default function mapper.
- readAvroGenericRecords(Schema) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that continuously reads binary encoded Avro messages into the Avro GenericRecord type.
- readAvros(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that continuously reads binary encoded Avro messages of the given type from a Google Cloud Pub/Sub stream.
- readAvrosWithBeamSchema(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that continuously reads binary encoded Avro messages of the specific type.
- ReadBuilder - Class in org.apache.beam.sdk.io.snowflake.crosslanguage
- ReadBuilder() - Constructor for class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder
- ReadBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.ReadBuilder
- ReadBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder
- ReadBuilder() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.ReadBuilder
- ReadBuilder.Configuration - Class in org.apache.beam.sdk.io.snowflake.crosslanguage
-
Parameters class to expose the transform to an external SDK.
- readBytes() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
-
A specific instance of uninitialized KafkaIO.read() where key and values are bytes.
- readCallMetric(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- readChangeStream() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
-
Creates an uninitialized BigtableIO.ReadChangeStream.
- readChangeStream() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
-
Creates an uninitialized instance of SpannerIO.ReadChangeStream.
- ReadChangeStream() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
- ReadChangeStream() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
- readChangeStreamPartition(PartitionRecord, StreamProgress, Instant, Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.ChangeStreamDao
-
Streams a partition.
- readChangeStreamPartitionAction(MetadataTableDao, ChangeStreamDao, ChangeStreamMetrics, ChangeStreamAction, Duration, SizeEstimator<KV<ByteString, ChangeStreamRecord>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class for processing ReadChangeStreamPartitionDoFn.
- ReadChangeStreamPartitionAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
-
This class is part of the ReadChangeStreamPartitionDoFn SDF.
- ReadChangeStreamPartitionAction(MetadataTableDao, ChangeStreamDao, ChangeStreamMetrics, ChangeStreamAction, Duration, SizeEstimator<KV<ByteString, ChangeStreamRecord>>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ReadChangeStreamPartitionAction
- ReadChangeStreamPartitionDoFn - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
- ReadChangeStreamPartitionDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
An SDF (Splittable DoFn) class which is responsible for performing a change stream query for a given partition.
- ReadChangeStreamPartitionDoFn(DaoFactory, ActionFactory, ChangeStreamMetrics, Duration) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
- ReadChangeStreamPartitionDoFn(DaoFactory, MapperFactory, ActionFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
-
This class needs a DaoFactory to build DAOs to access the partition metadata tables and to perform the change streams query.
- ReadChangeStreamPartitionProgressTracker - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction
-
RestrictionTracker used by ReadChangeStreamPartitionDoFn to keep track of the progress of the stream and to split the restriction for runner initiated checkpoints.
- ReadChangeStreamPartitionProgressTracker(StreamProgress) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
-
Constructs a restriction tracker with the streamProgress.
- ReadChangeStreamPartitionRangeTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
This restriction tracker delegates most of its behavior to an internal TimestampRangeTracker.
- ReadChangeStreamPartitionRangeTracker(PartitionMetadata, TimestampRange) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.ReadChangeStreamPartitionRangeTracker
-
Receives the partition that will be queried and the timestamp range that belongs to it.
- Read consistency - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- ReadDataBuilder() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder
- readDecompressed(ReadableByteChannel) - Method in enum class org.apache.beam.sdk.io.Compression
- readDetectNewPartitionMissingPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Read and deserialize missing partitions and how long they have been missing from the metadata table.
- readDetectNewPartitionsState() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Read the low watermark of the pipeline from Detect New Partition row.
- reader - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase.ReaderAndOutput
- Reader() - Constructor for class org.apache.beam.sdk.io.Source.Reader
- ReaderAndOutput(String, Source.Reader<T>, boolean) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase.ReaderAndOutput
- ReaderInvocationUtil<OutputT, ReaderT> - Class in org.apache.beam.runners.flink.metrics
-
Util for invoking Source.Reader methods that might require a MetricsContainerImpl to be active.
- ReaderInvocationUtil(String, PipelineOptions, FlinkMetricContainerBase) - Constructor for class org.apache.beam.runners.flink.metrics.ReaderInvocationUtil
- readExternal(ObjectInput) - Method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
- readFhirResource(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Read FHIR resource HTTP body.
- readFhirResource(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- readField(Row, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundle
-
Reads a single field from the row (of type RowWithGetters).
- readField(Row, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfNestedStringBundle
- readField(Row, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfStringBundle
- readField(Row, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfIntBundle
- readField(Row, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfNestedIntBundle
- readField(Row, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedBytesBundle
- readField(Row, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedIntBundle
- ReadFileRangesFnExceptionHandler() - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler
- readFiles() - Static method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO
-
Like ContextualTextIO.read(), but reads each file in a PCollection of FileIO.ReadableFile, returned by FileIO.readMatches().
- readFiles() - Static method in class org.apache.beam.sdk.io.TextIO
-
Like TextIO.read(), but reads each file in a PCollection of FileIO.ReadableFile, returned by FileIO.readMatches().
- readFiles() - Static method in class org.apache.beam.sdk.io.TFRecordIO
-
Like TFRecordIO.read(), but reads each file in a PCollection of FileIO.ReadableFile, returned by FileIO.readMatches().
- readFiles() - Static method in class org.apache.beam.sdk.io.xml.XmlIO
-
Like XmlIO.read(), but reads each file in a PCollection of FileIO.ReadableFile, which allows more flexible usage via different configuration options of FileIO.match() and FileIO.readMatches() that are not explicitly provided for XmlIO.read().
- readFiles(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Like AvroIO.read(java.lang.Class<T>), but reads each file in a PCollection of FileIO.ReadableFile, returned by FileIO.readMatches().
- readFiles(Class<T>) - Static method in class org.apache.beam.sdk.io.thrift.ThriftIO
-
Reads each file in a PCollection of FileIO.ReadableFile, which allows more flexible usage.
- readFiles(Schema) - Static method in class org.apache.beam.sdk.io.parquet.ParquetIO
-
Like ParquetIO.read(Schema), but reads each file in a PCollection of FileIO.ReadableFile, which allows more flexible usage.
- ReadFiles() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
- ReadFiles() - Constructor for class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.ReadFiles
- ReadFiles() - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
- ReadFiles() - Constructor for class org.apache.beam.sdk.io.TextIO.ReadFiles
- ReadFiles() - Constructor for class org.apache.beam.sdk.io.TFRecordIO.ReadFiles
- ReadFiles() - Constructor for class org.apache.beam.sdk.io.thrift.ThriftIO.ReadFiles
- ReadFiles() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
- readFilesGenericRecords(String) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Like AvroIO.readGenericRecords(String), but for FileIO.ReadableFile collections.
- readFilesGenericRecords(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Like AvroIO.readGenericRecords(Schema), but for a PCollection of FileIO.ReadableFile, for example, returned by FileIO.readMatches().
- readFrom(InputStream, int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Reads length bytes from the specified input stream writing them into the backing data store starting at offset.
- Read from Cdap Plugin Bounded Source - Search tag in class org.apache.beam.sdk.io.cdap.CdapIO
- Section
- Read from Cdap Plugin Streaming Source - Search tag in class org.apache.beam.sdk.io.cdap.CdapIO
- Section
- Read from Kafka as a DoFn - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- Read from Kafka as UnboundedSource - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- Read From Kafka Dynamically - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- ReadFromMySqlSchemaTransformProvider - Class in org.apache.beam.sdk.io.jdbc.providers
- ReadFromMySqlSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.jdbc.providers.ReadFromMySqlSchemaTransformProvider
- ReadFromOracleSchemaTransformProvider - Class in org.apache.beam.sdk.io.jdbc.providers
- ReadFromOracleSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.jdbc.providers.ReadFromOracleSchemaTransformProvider
- readFromPort(BeamFnApi.RemoteGrpcPort, String) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
- ReadFromPostgresSchemaTransformProvider - Class in org.apache.beam.sdk.io.jdbc.providers
- ReadFromPostgresSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.jdbc.providers.ReadFromPostgresSchemaTransformProvider
- ReadFromPulsarDoFn - Class in org.apache.beam.sdk.io.pulsar
-
Transform for reading from Apache Pulsar.
- ReadFromPulsarDoFn(PulsarIO.Read) - Constructor for class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
- readFromSource(BoundedSource<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Reads all elements from the given BoundedSource.
- readFromSplitsOfSource(BoundedSource<T>, long, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
- ReadFromSqlServerSchemaTransformProvider - Class in org.apache.beam.sdk.io.jdbc.providers
- ReadFromSqlServerSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.jdbc.providers.ReadFromSqlServerSchemaTransformProvider
- readFromStartedReader(Source.Reader<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Reads all elements from the given started Source.Reader.
- readFromUnstartedReader(Source.Reader<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Reads all elements from the given unstarted Source.Reader.
- readFullyAsBytes() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
-
Returns the full contents of the file as bytes.
- readFullyAsUTF8String() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
-
Returns the full contents of the file as a String decoded as UTF-8.
- readGenericRecords(String) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Reads Avro file(s) containing records of the specified schema.
- readGenericRecords(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Reads Avro file(s) containing records of the specified schema.
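A minimal sketch of the AvroIO.readGenericRecords(Schema) entry above; the file pattern is hypothetical and the supplied schema is assumed to match the records in the files:

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.extensions.avro.io.AvroIO;
import org.apache.beam.sdk.values.PCollection;

class AvroGenericReadExample {
  static PCollection<GenericRecord> records(Pipeline p, Schema schema) {
    // Reads every Avro file matching the (hypothetical) pattern with the given schema.
    return p.apply(AvroIO.readGenericRecords(schema).from("gs://my-bucket/events-*.avro"));
  }
}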
- Reading - Search tag in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
- Section
- Reading - Search tag in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
- Section
- Reading a very large number of files - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroIO
- Section
- Reading a very large number of files - Search tag in class org.apache.beam.sdk.io.TextIO
- Section
- Reading a very large number of files - Search tag in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO
- Section
- Reading Avro files - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroIO
- Section
- Reading CSV files - Search tag in class org.apache.beam.sdk.io.csv.CsvIO
- Section
- Reading files - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Reading from a JMS destination - Search tag in class org.apache.beam.sdk.io.jms.JmsIO
- Section
- Reading from a MQTT broker - Search tag in class org.apache.beam.sdk.io.mqtt.MqttIO
- Section
- Reading from Apache Cassandra - Search tag in class org.apache.beam.sdk.io.cassandra.CassandraIO
- Section
- Reading from a PCollection of filepatterns - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroIO
- Section
- Reading from a queue (Read#from(Solace.Queue)} or a topic (Read#from(Solace.Topic)) - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- Reading from Cloud Bigtable - Search tag in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
- Section
- Reading from Cloud Spanner - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- Reading from DynamoDB - Search tag in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO
- Section
- Reading from Elasticsearch - Search tag in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
- Section
- Reading from HBase - Search tag in class org.apache.beam.sdk.io.hbase.HBaseIO
- Section
- Reading from InfluxDB - Search tag in class org.apache.beam.sdk.io.influxdb.InfluxDbIO
- Section
- Reading from JDBC datasource - Search tag in class org.apache.beam.sdk.io.jdbc.JdbcIO
- Section
- Reading from Kafka topics - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- Reading from Kinesis - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- Reading from Kudu - Search tag in class org.apache.beam.sdk.io.kudu.KuduIO
- Section
- Reading from MongoDB - Search tag in class org.apache.beam.sdk.io.mongodb.MongoDbIO
- Section
- Reading from MongoDB via GridFS - Search tag in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
- Section
- Reading from Neo4j - Search tag in class org.apache.beam.sdk.io.neo4j.Neo4jIO
- Section
- Reading from SingleStoreDB datasource - Search tag in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
- Section
- Reading from Snowflake - Search tag in class org.apache.beam.sdk.io.snowflake.SnowflakeIO
- Section
- Reading from Solace - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- Reading from Solr - Search tag in class org.apache.beam.sdk.io.solr.SolrIO
- Section
- Reading from SQS - Search tag in class org.apache.beam.sdk.io.aws2.sqs.SqsIO
- Section
- Reading from Tables - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- Reading from text files - Search tag in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO
- Section
- Reading JSON files - Search tag in class org.apache.beam.sdk.io.json.JsonIO
- Section
- Reading Parquet files - Search tag in class org.apache.beam.sdk.io.parquet.ParquetIO
- Section
- Reading Records - Search tag in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
- Section
- Reading records of a known schema - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroIO
- Section
- Reading records of an unknown schema - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroIO
- Section
- Reading records of an unknown schema - Search tag in class org.apache.beam.sdk.io.parquet.ParquetIO
- Section
- Reading Redis key/value pairs - Search tag in class org.apache.beam.sdk.io.redis.RedisIO
- Section
- Reading Study-Level Metadata - Search tag in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO
- Section
- Reading text files - Search tag in class org.apache.beam.sdk.io.TextIO
- Section
- Reading Thrift Files - Search tag in class org.apache.beam.sdk.io.thrift.ThriftIO
- Section
- Reading using Hadoop HadoopFormatIO - Search tag in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
- Section
- Reading using HCatalog - Search tag in class org.apache.beam.sdk.io.hcatalog.HCatalogIO
- Section
- Reading using SparkReceiverIO - Search tag in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO
- Section
- Reading with Metadata from a MQTT broker - Search tag in class org.apache.beam.sdk.io.mqtt.MqttIO
- Section
- readKeyPatterns() - Static method in class org.apache.beam.sdk.io.redis.RedisIO
-
Like RedisIO.read() but executes multiple instances of the Redis query substituting each element of a PCollection as key pattern.
- ReadKeyPatterns() - Constructor for class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
- readLater() - Method in interface org.apache.beam.sdk.state.BagState
- readLater() - Method in interface org.apache.beam.sdk.state.CombiningState
- readLater() - Method in interface org.apache.beam.sdk.state.GroupingState
- readLater() - Method in interface org.apache.beam.sdk.state.ReadableState
-
Indicate that the value will be read later.
- readLater() - Method in interface org.apache.beam.sdk.state.SetState
- readLater() - Method in interface org.apache.beam.sdk.state.ValueState
- readLater() - Method in interface org.apache.beam.sdk.state.WatermarkHoldState
- readMatches() - Static method in class org.apache.beam.sdk.io.FileIO
-
Converts each result of FileIO.match() or FileIO.matchAll() to a FileIO.ReadableFile which can be used to read the contents of each file, optionally decompressing it.
- ReadMatches() - Constructor for class org.apache.beam.sdk.io.FileIO.ReadMatches
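A minimal sketch combining the FileIO.readMatches() entry above with ReadableFile.readFullyAsUTF8String(); the file pattern is a hypothetical placeholder:

import java.io.IOException;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.FileIO;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptors;

class ReadMatchesExample {
  static PCollection<String> wholeFiles(Pipeline p) {
    return p
        .apply(FileIO.match().filepattern("/tmp/data/*.json")) // hypothetical pattern
        .apply(FileIO.readMatches())
        .apply(MapElements.into(TypeDescriptors.strings())
            .via((FileIO.ReadableFile f) -> {
              try {
                return f.readFullyAsUTF8String(); // whole file as one String element
              } catch (IOException e) {
                throw new RuntimeException(e);
              }
            }));
  }
}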
- readMessage() - Static method in class org.apache.beam.sdk.io.jms.JmsIO
- readMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
- readMessagesWithAttributes() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
- readMessagesWithAttributesAndMessageId() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
- readMessagesWithAttributesAndMessageIdAndOrderingKey() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
- readMessagesWithAttributesWithCoderAndParseFn(Coder<T>, SimpleFunction<PubsubMessage, T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream, mapping each PubsubMessage, with attributes, into type T using the supplied parse function and coder.
- readMessagesWithCoderAndParseFn(Coder<T>, SimpleFunction<PubsubMessage, T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream, mapping each PubsubMessage into type T using the supplied parse function and coder.
- readMessagesWithMessageId() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
- readNewPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
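A minimal sketch of the PubsubIO.readMessagesWithAttributes() entry above; the subscription path is a hypothetical placeholder:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;
import org.apache.beam.sdk.values.PCollection;

class PubsubReadExample {
  static PCollection<PubsubMessage> messages(Pipeline p) {
    // Continuously reads messages (with attributes) from the hypothetical subscription.
    return p.apply(
        PubsubIO.readMessagesWithAttributes()
            .fromSubscription("projects/my-project/subscriptions/my-sub"));
  }
}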
- readNewPartitionsIncludingDeleted() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
- readNextBlock() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
- readNextBlock() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
Read the next block from the input.
- readNextRecord() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
-
Reads the next record from the block and returns true iff one exists.
- readNextRecord() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
Reads the next record from the current block if possible.
- readNextRecord() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
Reads the next record via the delegate reader.
- readNextRecord() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
Reads the next record from the channel provided by FileBasedSource.FileBasedReader.startReading(java.nio.channels.ReadableByteChannel).
- readNItemsFromStartedReader(Source.Reader<T>, int) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Read elements from a Source.Reader that has already had Source.Reader.start() called on it, until n elements are read.
- readNItemsFromUnstartedReader(Source.Reader<T>, int) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Read elements from a Source.Reader until n elements are read.
- readObject(FileSystem, Path) - Static method in class org.apache.beam.runners.spark.translation.streaming.Checkpoint
- readOnly(String, Map<String, BeamSqlTable>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
-
This method creates BeamSqlEnv using empty Pipeline Options.
- ReadOnlyTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider
-
A ReadOnlyTableProvider provides an in-memory, read-only set of BeamSqlTables.
- ReadOnlyTableProvider(String, Map<String, BeamSqlTable>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
- ReadOperation - Class in org.apache.beam.sdk.io.gcp.spanner
-
Encapsulates a Spanner read operation.
- ReadOperation() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- readPrivateKeyFile(String) - Static method in class org.apache.beam.sdk.io.snowflake.KeyPairUtils
- readProtoDynamicMessages(Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Similar to PubsubIO.readProtoDynamicMessages(ProtoDomain, String) but for when the Descriptors.Descriptor is already known.
- readProtoDynamicMessages(ProtoDomain, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that continuously reads binary encoded protobuf messages for the type specified by fullMessageName.
- readProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that continuously reads binary encoded protobuf messages of the given type from a Google Cloud Pub/Sub stream.
- readRange(Instant, Instant) - Method in interface org.apache.beam.sdk.state.OrderedListState
-
Read a timestamp-limited subrange of the list.
- readRangeLater(Instant, Instant) - Method in interface org.apache.beam.sdk.state.OrderedListState
-
Call to indicate that a specific range will be read from the list, allowing runners to batch multiple range reads.
- readRangesFn() - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSource
- readRangesFn() - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- readRangesFn() - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceWithFilename
- ReadRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation.ReadRegistrar
- ReadRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.ReadRegistrar
- ReadRegistrar() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergSchemaTransformTranslation.ReadRegistrar
- ReadRegistrar() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation.ReadRegistrar
- ReadRegistrar() - Constructor for class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation.ReadRegistrar
- readRemainingFromReader(Source.Reader<T>, boolean) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Read all remaining elements from a Source.Reader.
- readResolve() - Method in class org.apache.beam.runners.twister2.translation.wrappers.Twister2BoundedSource
- readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.AssignWindowsFunction
- readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToElemFunction
- readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunction
- readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunctionPrimitive
- readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
- readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.ElemToBytesFunction
- readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
- readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
- readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.MapToTupleFunction
- readResources() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Read resources from a PCollection of resource IDs (e.g.
- readRows() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
-
Read Beam Rows from a JDBC data source.
- readRows() - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
-
Read Beam Rows from a SingleStoreDB datasource.
- readRows(ReadRowsRequest) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
-
Read rows in the context of a specific read stream.
- readRows(ReadRowsRequest, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
- readRows(IcebergCatalogConfig) - Static method in class org.apache.beam.sdk.io.iceberg.IcebergIO
- ReadRows() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- ReadRows() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
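A minimal sketch of the JdbcIO.readRows() entry above; the driver, URL, credentials, and query are hypothetical placeholders:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.jdbc.JdbcIO;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

class JdbcReadRowsExample {
  static PCollection<Row> rows(Pipeline p) {
    return p.apply(
        JdbcIO.readRows()
            .withDataSourceConfiguration(
                JdbcIO.DataSourceConfiguration.create(
                        "org.postgresql.Driver", "jdbc:postgresql://localhost:5432/mydb")
                    .withUsername("user")      // hypothetical credentials
                    .withPassword("secret"))
            .withQuery("SELECT id, name FROM person")); // hypothetical query
  }
}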
- readSnapshot(int, DataInputView, ClassLoader) - Method in class org.apache.beam.runners.flink.translation.types.UnversionedTypeSerializerSnapshot
- readSnapshot(int, DataInputView, ClassLoader) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer.FlinkStateNameSpaceSerializerSnapshot
- readSource(PipelineOptions, TupleTag<T>, DoFn.MultiOutputReceiver, BoundedSource<T>, BigQueryIO.TypedRead.ErrorHandlingParseFn<T>, BadRecordRouter) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- readSourceDescriptors() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
-
Creates an uninitialized KafkaIO.ReadSourceDescriptors PTransform.
- ReadSourceDescriptors() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- ReadSourceTranslatorBatch<T> - Class in org.apache.beam.runners.twister2.translators.batch
-
Source translator.
- ReadSourceTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.ReadSourceTranslatorBatch
- ReadSourceTranslatorStream<T> - Class in org.apache.beam.runners.twister2.translators.streaming
-
Stream source translator.
- ReadSourceTranslatorStream() - Constructor for class org.apache.beam.runners.twister2.translators.streaming.ReadSourceTranslatorStream
- ReadSpannerSchema - Class in org.apache.beam.sdk.io.gcp.spanner
-
This DoFn reads Cloud Spanner 'information_schema.*' tables to build the SpannerSchema.
- ReadSpannerSchema(SpannerConfig, PCollectionView<Dialect>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
-
Constructor for creating an instance of the ReadSpannerSchema class.
- ReadSpannerSchema(SpannerConfig, PCollectionView<Dialect>, Set<String>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
-
Constructor for creating an instance of the ReadSpannerSchema class.
- readStreamPartitionsWithWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Returns a list of locked StreamPartitions and their watermarks.
- readStrings() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that continuously reads UTF-8 encoded strings from a Google Cloud Pub/Sub stream.
- readStudyMetadata() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO
- readTableRows() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Like BigQueryIO.read(SerializableFunction) but represents each row as a TableRow.
- readTableRowsWithSchema() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Like BigQueryIO.readTableRows() but with Schema support.
- readTimeout() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Milliseconds to wait for a read on a socket before an exception is thrown.
- readTimeout(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Milliseconds to wait for a read on a socket before an exception is thrown.
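For illustration, a minimal sketch of configuring this timeout, assuming the usual AutoValue-style HttpClientConfiguration.builder() entry point; the 30-second value is illustrative:

    import org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration;

    // Wait up to 30 s for a socket read before an exception is thrown.
    HttpClientConfiguration http =
        HttpClientConfiguration.builder()
            .readTimeout(30_000)
            .build();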
- ReadUtils - Class in org.apache.beam.sdk.io.iceberg
-
Helper class for source operations.
- ReadUtils() - Constructor for class org.apache.beam.sdk.io.iceberg.ReadUtils
- readValue - Variable in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
- readValue - Variable in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
- readWithDatumReader(AvroSource.DatumReaderFactory<T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Reads from a BigQuery table or query and returns a PCollection with one element per row of the table or query result.
- readWithFilter(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from a single store matching a filter.
- readWithFilter(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from a single store matching a filter.
- readWithKeyDenormalization(byte[], DataInputView) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- readWithMetadata() - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO
- readWithPartitions() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
- readWithPartitions() - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
-
Like SingleStoreIO.read(), but executes multiple instances of the query on the same table for each database partition.
- readWithPartitions(JdbcReadWithPartitionsHelper<PartitionColumnT>) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
-
Like JdbcIO.readAll(), but executes multiple instances of the query on the same table (subquery) using ranges.
- readWithPartitions(TypeDescriptor<PartitionColumnT>) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
-
Like JdbcIO.readAll(), but executes multiple instances of the query on the same table (subquery) using ranges.
- ReadWithPartitions() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
- ReadWithPartitions() - Constructor for class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
- readWithPartitionsRows() - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
-
Like SingleStoreIO.readRows(), but executes multiple instances of the query on the same table for each database partition.
- readWithSchema() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- ReadWriteRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.ReadWriteRegistrar
- ReadWriteRegistrar() - Constructor for class org.apache.beam.sdk.io.TFRecordSchemaTransformTranslation.ReadWriteRegistrar
- receive() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageReceiver
-
Receives a message from the broker.
- receive() - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageReceiver
- ReceiverBuilder<X, T> - Class in org.apache.beam.sdk.io.sparkreceiver
-
Class for building an instance of Receiver that uses Apache Beam mechanisms instead of the Spark environment.
- ReceiverBuilder(Class<T>) - Constructor for class org.apache.beam.sdk.io.sparkreceiver.ReceiverBuilder
- RecommendationAICreateCatalogItem - Class in org.apache.beam.sdk.extensions.ml
-
A PTransform using the Recommendations AI API (https://cloud.google.com/recommendations).
- RecommendationAICreateCatalogItem() - Constructor for class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
- RecommendationAIImportCatalogItems - Class in org.apache.beam.sdk.extensions.ml
-
A PTransform connecting to the Recommendations AI API (https://cloud.google.com/recommendations) and creating CatalogItems.
- RecommendationAIImportCatalogItems() - Constructor for class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
- RecommendationAIImportUserEvents - Class in org.apache.beam.sdk.extensions.ml
-
A PTransform connecting to the Recommendations AI API (https://cloud.google.com/recommendations) and creating UserEvents.
- RecommendationAIImportUserEvents() - Constructor for class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- RecommendationAIIO - Class in org.apache.beam.sdk.extensions.ml
-
The RecommendationAIIO class acts as a wrapper around the Recommendations AI PTransforms.
- RecommendationAIIO() - Constructor for class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
- RecommendationAIPredict - Class in org.apache.beam.sdk.extensions.ml
-
A PTransform using the Recommendations AI API (https://cloud.google.com/recommendations).
- RecommendationAIPredict() - Constructor for class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- RecommendationAIWriteUserEvent - Class in org.apache.beam.sdk.extensions.ml
-
A PTransform using the Recommendations AI API (https://cloud.google.com/recommendations).
- RecommendationAIWriteUserEvent() - Constructor for class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
- record(List<FieldOperation>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- Record() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.Record
- Record() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record
- RECORD - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
- RECORD_NUM - Static variable in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
- RECORD_NUM_IN_OFFSET - Static variable in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
- RECORD_OFFSET - Static variable in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
- RecordAggregation() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation
- recordException(Throwable) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- recordId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
-
Id to pass to the runner to distinguish this message from all others.
- recordId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
-
If using an id attribute, the record id to associate with this record's metadata so the receiver can reject duplicates.
- RECORDING_ROUTER - Static variable in interface org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter
- RecordingBadRecordRouter() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter.RecordingBadRecordRouter
- RecordToPublishResultDoFn - Class in org.apache.beam.sdk.io.solace.write
-
Transforms a record into a PublishResult so that the windowing can be captured with the right strategy.
- RecordToPublishResultDoFn() - Constructor for class org.apache.beam.sdk.io.solace.write.RecordToPublishResultDoFn
- RecordWithMetadata - Class in org.apache.beam.sdk.io.contextualtextio
-
Helper class based on Row; it provides the metadata associated with each record when reading from file(s) using ContextualTextIO.
- RecordWithMetadata() - Constructor for class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
- recoverRecords(Consumer<HistoryRecord>) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
- RedisConnectionConfiguration - Class in org.apache.beam.sdk.io.redis
-
RedisConnectionConfiguration describes and wraps a connection configuration to a Redis server or cluster.
- RedisConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
- RedisCursor - Class in org.apache.beam.sdk.io.redis
- RedisIO - Class in org.apache.beam.sdk.io.redis
-
An IO to manipulate a Redis key/value database.
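For illustration, a minimal sketch of reading key/value pairs with RedisIO; the endpoint and key pattern are illustrative placeholders:

    PCollection<KV<String, String>> entries =
        pipeline.apply(
            RedisIO.read()
                .withEndpoint("localhost", 6379) // illustrative host and port
                .withKeyPattern("session:*"));   // keys to scan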
- RedisIO.Read - Class in org.apache.beam.sdk.io.redis
-
Implementation of RedisIO.read().
- RedisIO.ReadKeyPatterns - Class in org.apache.beam.sdk.io.redis
-
Implementation of RedisIO.readKeyPatterns().
- RedisIO.Write - Class in org.apache.beam.sdk.io.redis
-
A PTransform to write to a Redis server.
- RedisIO.Write.Method - Enum Class in org.apache.beam.sdk.io.redis
-
Determines the method used to insert data in Redis.
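For illustration, a minimal sketch of selecting an insert method when writing; SADD adds each value to a Redis set, and the endpoint is illustrative:

    kvs.apply(
        RedisIO.write()
            .withEndpoint("localhost", 6379)          // illustrative
            .withMethod(RedisIO.Write.Method.SADD));  // insert via SADD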
- RedisIO.WriteStreams - Class in org.apache.beam.sdk.io.redis
-
A PTransform to write stream key pairs (https://redis.io/topics/streams-intro) to a Redis server.
- Redistribute - Class in org.apache.beam.sdk.transforms
-
A family of PTransforms that returns a PCollection equivalent to its input but functions as an operational hint to a runner that redistributing the data in some way is likely useful.
- Redistribute() - Constructor for class org.apache.beam.sdk.transforms.Redistribute
- Redistribute.RedistributeArbitrarily<T> - Class in org.apache.beam.sdk.transforms
-
Noop transform that hints to the runner to try to redistribute the work evenly, or via whatever clever strategy the runner comes up with.
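For illustration, a minimal sketch of both Redistribute variants; input and keyed are hypothetical PCollections:

    // Hint the runner to rebalance work without changing the data.
    PCollection<String> spread = input.apply(Redistribute.arbitrarily());

    // For KV data, redistribute by key.
    PCollection<KV<String, Long>> byKey = keyed.apply(Redistribute.byKey());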
- Redistribute.RedistributeByKey<K, V> - Class in org.apache.beam.sdk.transforms
- Redistribute.Registrar - Class in org.apache.beam.sdk.transforms
-
Registers translators for the Redistribute family of transforms.
- reduce(Iterable<WindowedValue<InputT>>, Collector<RawUnionValue>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction
-
For stateful and timer processing via a GroupReduceFunction.
- reduce(Iterable<WindowedValue<KV<K, AccumT>>>, Collector<WindowedValue<KV<K, OutputT>>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkReduceFunction
- reduce(Iterable<WindowedValue<KV<K, InputT>>>, Collector<WindowedValue<KV<K, Iterable<InputT>>>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkNonMergingReduceFunction
- reduce(Iterable<WindowedValue<KV<K, InputT>>>, Collector<WindowedValue<KV<K, OutputT>>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkMergingNonShuffleReduceFunction
- reduce(Iterable<WindowedValue<KV<K, V>>>, Collector<WindowedValue<RawUnionValue>>) - Method in class org.apache.beam.runners.flink.translation.functions.FlinkStatefulDoFnFunction
- ReferenceCountingExecutableStageContextFactory - Class in org.apache.beam.runners.fnexecution.control
-
ExecutableStageContext.Factory which counts ExecutableStageContext references for bookkeeping.
- ReferenceCountingExecutableStageContextFactory.Creator - Interface in org.apache.beam.runners.fnexecution.control
-
Interface for a creator that extends Serializable.
- References - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- References - Search tag in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
- Section
- References - Search tag in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
- Section
- referencesSingleField() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Returns true if this descriptor references only a single, non-wildcard field.
- reflect(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an AvroCoder instance for the provided element type respecting Avro's Reflect* suite for encoding and decoding.
- reflect(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
-
Returns an AvroDatumFactory instance for the provided element type respecting Avro's Reflect* suite for encoding and decoding.
- reflect(Class<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an AvroCoder instance for the provided element type respecting Avro's Reflect* suite for encoding and decoding.
- reflect(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an AvroCoder instance for the provided element type respecting Avro's Reflect* suite for encoding and decoding.
- ReflectDatumFactory(Class<T>) - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.ReflectDatumFactory
- ReflectUtils - Class in org.apache.beam.sdk.schemas.utils
-
A set of reflection helper methods.
- ReflectUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.ReflectUtils
- ReflectUtils.ClassWithSchema - Class in org.apache.beam.sdk.schemas.utils
-
Represents a class and a schema.
- ReflectUtils.TypeDescriptorWithSchema<T> - Class in org.apache.beam.sdk.schemas.utils
-
Represents a type descriptor and a schema.
- refreshSchema(TableReference, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
- refreshThread() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
- Regex - Class in org.apache.beam.sdk.transforms
-
PTransforms to use Regular Expressions to process elements in a PCollection.
- Regex.AllMatches - Class in org.apache.beam.sdk.transforms
-
Regex.AllMatches<String> takes a PCollection<String> and returns a PCollection<List<String>> representing the value extracted from all the Regex groups of the input PCollection to the number of times that element occurs in the input.
- Regex.Find - Class in org.apache.beam.sdk.transforms
-
Regex.Find<String> takes a PCollection<String> and returns a PCollection<String> representing the value extracted from the Regex groups of the input PCollection to the number of times that element occurs in the input.
- Regex.FindAll - Class in org.apache.beam.sdk.transforms
-
Regex.FindAll<String> takes a PCollection<String> and returns a PCollection<List<String>> representing the value extracted from the Regex groups of the input PCollection to the number of times that element occurs in the input.
- Regex.FindKV - Class in org.apache.beam.sdk.transforms
-
Regex.FindKV<KV<String, String>> takes a PCollection<String> and returns a PCollection<KV<String, String>> representing the key and value extracted from the Regex groups of the input PCollection to the number of times that element occurs in the input.
- Regex.FindName - Class in org.apache.beam.sdk.transforms
-
Regex.FindName<String> takes a PCollection<String> and returns a PCollection<String> representing the value extracted from the Regex groups of the input PCollection to the number of times that element occurs in the input.
- Regex.FindNameKV - Class in org.apache.beam.sdk.transforms
-
Regex.FindNameKV<KV<String, String>> takes a PCollection<String> and returns a PCollection<KV<String, String>> representing the key and value extracted from the Regex groups of the input PCollection to the number of times that element occurs in the input.
- Regex.Matches - Class in org.apache.beam.sdk.transforms
-
Regex.Matches<String> takes a PCollection<String> and returns a PCollection<String> representing the value extracted from the Regex groups of the input PCollection to the number of times that element occurs in the input.
- Regex.MatchesKV - Class in org.apache.beam.sdk.transforms
-
Regex.MatchesKV<KV<String, String>> takes a PCollection<String> and returns a PCollection<KV<String, String>> representing the key and value extracted from the Regex groups of the input PCollection to the number of times that element occurs in the input.
- Regex.MatchesName - Class in org.apache.beam.sdk.transforms
-
Regex.MatchesName<String> takes a PCollection<String> and returns a PCollection<String> representing the value extracted from the Regex groups of the input PCollection to the number of times that element occurs in the input.
- Regex.MatchesNameKV - Class in org.apache.beam.sdk.transforms
-
Regex.MatchesNameKV<KV<String, String>> takes a PCollection<String> and returns a PCollection<KV<String, String>> representing the key and value extracted from the Regex groups of the input PCollection to the number of times that element occurs in the input.
- Regex.ReplaceAll - Class in org.apache.beam.sdk.transforms
-
Regex.ReplaceAll<String> takes a PCollection<String> and returns a PCollection<String> with all Strings that matched the Regex replaced with the replacement string.
- Regex.ReplaceFirst - Class in org.apache.beam.sdk.transforms
-
Regex.ReplaceFirst<String> takes a PCollection<String> and returns a PCollection<String> with the first String that matched the Regex replaced with the replacement string.
- Regex.Split - Class in org.apache.beam.sdk.transforms
-
Regex.Split<String> takes a PCollection<String> and returns a PCollection<String> with the input string split into individual items in a list.
- RegexMatcher - Class in org.apache.beam.sdk.testing
-
Hamcrest matcher to assert a string matches a pattern.
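For illustration, a minimal sketch of the Regex transforms indexed above; the patterns are illustrative:

    // Keep only elements that fully match the pattern.
    PCollection<String> words = lines.apply(Regex.matches("[A-Za-z]+"));

    // Replace every run of whitespace with a single space.
    PCollection<String> squeezed = lines.apply(Regex.replaceAll("\\s+", " "));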
- RegexMatcher(String) - Constructor for class org.apache.beam.sdk.testing.RegexMatcher
- region() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
-
Optional Region.
- region(Region) - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
-
Optional Region.
- register(Class<? extends PipelineOptions>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
This registers the interface with this factory.
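For illustration, a minimal sketch of registering a custom options interface so it is validated and parsed from command-line arguments; MyOptions is hypothetical:

    public interface MyOptions extends PipelineOptions {
      String getInputFile();
      void setInputFile(String value);
    }

    PipelineOptionsFactory.register(MyOptions.class);
    MyOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);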
- register(WatchService, WatchEvent.Kind<?>...) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- register(WatchService, WatchEvent.Kind<?>[], WatchEvent.Modifier...) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- register(RelOptPlanner) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
- register(RelOptPlanner) - Method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
- registerBadRecordErrorHandler(PTransform<PCollection<BadRecord>, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
- registerByteSizeObserver(T, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.NullableCoder
-
Overridden to short-circuit the default StructuredCoder behavior of encoding and counting the bytes.
- registerByteSizeObserver(IterableT, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
- registerByteSizeObserver(Map<K, V>, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.MapCoder
- registerByteSizeObserver(Optional<T>, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.OptionalCoder
-
Overridden to short-circuit the default StructuredCoder behavior of encoding and counting the bytes.
- registerByteSizeObserver(SortedMap<K, V>, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- registerByteSizeObserver(RawUnionValue, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
-
Notifies ElementByteSizeObserver about the byte size of the encoded value using this coder.
- registerByteSizeObserver(IntervalWindow, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
- registerByteSizeObserver(KV<K, V>, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.KvCoder
-
Notifies ElementByteSizeObserver about the byte size of the encoded value using this coder.
- registerByteSizeObserver(WindowedValue<T>, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- registerByteSizeObserver(WindowedValue<T>, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- registerByteSizeObserver(WindowedValue<T>, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.values.WindowedValues.ValueOnlyWindowedValueCoder
-
Deprecated.
- registerByteSizeObserver(ReadableDuration, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.DurationCoder
- registerByteSizeObserver(T, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.Coder
-
Notifies the ElementByteSizeObserver about the byte size of the encoded value using this Coder.
- registerByteSizeObserver(T, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
- registerClasses(Kryo) - Method in class org.apache.beam.runners.spark.coders.SparkRunnerKryoRegistrator
- registerClasses(Kryo) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.SparkSessionFactory.SparkKryoRegistrator
- registerCoderForClass(Class<?>, Coder<?>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
Registers the provided Coder for the given class.
- registerCoderForType(TypeDescriptor<?>, Coder<?>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
Registers the provided Coder for the given type.
- registerCoderProvider(CoderProvider) - Method in class org.apache.beam.sdk.coders.CoderRegistry
- registerConsumer(String, CloseableFnDataReceiver<BeamFnApi.Elements>) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
-
Registers a consumer for the specified instruction id.
- registerEnvironment(String, RunnerApi.Environment) - Method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
- registerFileSystemsOnce(PipelineOptions) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Register file systems once if never done before.
- registerForProcessBundleInstructionId(String, StateRequestHandler) - Method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
- registerForProcessBundleInstructionId(String, StateRequestHandler) - Method in interface org.apache.beam.runners.fnexecution.state.StateDelegator
-
Registers the supplied handler for the given process bundle instruction id for all BeamFnApi.StateRequests with a matching id.
- registerJavaBean(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Register a JavaBean type for automatic schema inference.
- registerJavaBean(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Register a JavaBean type for automatic schema inference.
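For illustration, a minimal sketch of registering a JavaBean for automatic schema inference; MyBean is a hypothetical class with JavaBean getters and setters:

    SchemaRegistry registry = pipeline.getSchemaRegistry();
    registry.registerJavaBean(MyBean.class); // schema is inferred from the bean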
- registerJob(String, Map<String, List<RunnerApi.ArtifactInformation>>) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
-
Registers a set of artifacts to be staged with this service.
- registerKnownTableNames(List<TableName>) - Method in interface org.apache.beam.sdk.extensions.sql.meta.CustomTableResolver
-
Register the table names as extracted from the FROM clause.
- registerKnownTableNames(List<TableName>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.FullNameTableProvider
- registerLineage(String, Schema) - Method in class org.apache.beam.sdk.io.cdap.context.StreamingSourceContextImpl
- registerMetricsForPipelineResult() - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
-
This should be called at the end of the Flink job and sets up an accumulator to push the metrics to the PipelineResult.
- registerOutputDataLocation(String, Coder<T>) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
-
Register the outbound data logical endpoint, returns the FnDataReceiver for processing the endpoint's outbound data.
- registerOutputTimersLocation(String, String, Coder<T>) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
-
Register the outbound timers logical endpoint, returns the FnDataReceiver for processing the endpoint's outbound timers data.
- registerPOJO(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Register a POJO type for automatic schema inference.
- registerPOJO(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Register a POJO type for automatic schema inference.
- registerProcessBundleDescriptor(BeamFnApi.ProcessBundleDescriptor) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
- registerProcessBundleDescriptor(BeamFnApi.ProcessBundleDescriptor) - Method in interface org.apache.beam.runners.fnexecution.control.InstructionRequestHandler
- registerProvider(TableProvider) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
- registerProvider(TableProvider) - Method in interface org.apache.beam.sdk.extensions.sql.meta.store.MetaStore
-
Register a table provider.
- registerReceiver(String, CloseableFnDataReceiver<BeamFnApi.Elements>) - Method in interface org.apache.beam.runners.fnexecution.data.FnDataService
-
Registers a receiver for the provided instruction id.
- registerReceiver(String, CloseableFnDataReceiver<BeamFnApi.Elements>) - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
- registerSchemaForClass(Class<T>, Schema, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Register a schema for a specific Class type.
- registerSchemaForType(TypeDescriptor<T>, Schema, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Register a schema for a specific TypeDescriptor type.
- registerSchemaProvider(Class<T>, SchemaProvider) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Register a SchemaProvider to be used for a specific type.
- registerSchemaProvider(SchemaProvider) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Register a SchemaProvider.
- registerSchemaProvider(TypeDescriptor<T>, SchemaProvider) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Register a SchemaProvider to be used for a specific type.
- registerTableProvider(String, TableProvider) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.CatalogManager
-
Registers a TableProvider and propagates it to all the Catalog instances available to this manager.
- registerTableProvider(String, TableProvider) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.EmptyCatalogManager
- registerTableProvider(String, TableProvider) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogManager
- registerTableProvider(TableProvider) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.CatalogManager
- registerTables(SchemaPlus, List<List<String>>) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.TableResolution
-
Registers tables that will be resolved during query analysis, so table providers can eagerly pre-load metadata.
- registerTransformTranslator(Class<TransformT>, TransformTranslator<? extends TransformT>) - Static method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
-
Records that instances of the specified PTransform class should be translated by default by the corresponding TransformTranslator.
- registerUdaf(String, Combine.CombineFn) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
-
Register a Combine.CombineFn as a UDAF function used in this query.
- registerUdf(String, Class<? extends BeamSqlUdf>) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
-
Register a UDF function used in this query.
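For illustration, a minimal sketch of registering a UDF with SqlTransform; CubicInteger is a hypothetical BeamSqlUdf implementation:

    PCollection<Row> result =
        rows.apply(
            SqlTransform.query("SELECT cubic(f_int) FROM PCOLLECTION")
                .registerUdf("cubic", CubicInteger.class));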
- registerUdf(String, SerializableFunction) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
-
Register a SerializableFunction as a UDF function used in this query.
- Registrar() - Constructor for class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey.Registrar
- Registrar() - Constructor for class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.Registrar
- Registrar() - Constructor for class org.apache.beam.sdk.transforms.Redistribute.Registrar
- Registration Of PipelineOptions - Search tag in interface org.apache.beam.sdk.options.PipelineOptions
- Section
- Reify - Class in org.apache.beam.sdk.transforms
-
PTransforms for converting between explicit and implicit form of various Beam values.
- ReifyAsIterable<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
This transform turns a side input into a singleton PCollection that can be used as the main input for another transform.
- ReifyAsIterable() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.ReifyAsIterable
- ReifyTimestampsAndWindowsFunction<K, V> - Class in org.apache.beam.runners.spark.translation
-
Simple Function to bring the windowing information into the value from the implicit background representation of the PCollection.
- ReifyTimestampsAndWindowsFunction() - Constructor for class org.apache.beam.runners.spark.translation.ReifyTimestampsAndWindowsFunction
- REJECT - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.TruncateTimestamps
-
Reject timestamps with greater-than-millisecond precision.
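For illustration, a minimal sketch of selecting this behavior, assuming the usual AutoValue-style ConversionOptions builder:

    BigQueryUtils.ConversionOptions options =
        BigQueryUtils.ConversionOptions.builder()
            .setTruncateTimestamps(
                BigQueryUtils.ConversionOptions.TruncateTimestamps.REJECT)
            .build();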
- rejectStateAndTimers(DoFn<?, ?>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Rejects a DoFn that uses state and timers.
- rel(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- relativeErrorForPrecision(int) - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- relativeFileNaming(ValueProvider<String>, FileIO.Write.FileNaming) - Static method in class org.apache.beam.sdk.io.FileIO.Write
- relativize(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- relBuilder - Variable in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
- release(ObjectT) - Method in class org.apache.beam.sdk.io.aws2.common.ObjectPool
-
Release a reference to a shared client instance.
- releaseByKey(KeyT) - Method in class org.apache.beam.sdk.io.aws2.common.ObjectPool
-
Release a reference to a shared object instance using ObjectPool.
- releaseDataset(Dataset) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- releaseJobIdLock(Configuration) - Method in interface org.apache.beam.sdk.io.hadoop.format.ExternalSynchronization
-
Deletes lock ids bound to the given job, if any exist.
- releaseJobIdLock(Configuration) - Method in class org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization
- releaseSavepoint(Savepoint) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- releaseStreamPartitionLockForDeletion(Range.ByteStringRange, String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
This is the first step of the two-phase delete of StreamPartition.
- RelMdNodeStats - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
This is the implementation of NodeStatsMetadata.
- RelMdNodeStats() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats
- RelType(String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
- remerge() - Static method in class org.apache.beam.sdk.transforms.windowing.Window
-
Creates a Window PTransform that does not change assigned windows, but will cause windows to be merged again as part of the next GroupByKey.
- RemoteBundle - Interface in org.apache.beam.runners.fnexecution.control
-
A bundle capable of handling input data elements for a bundle descriptor by forwarding them to a remote environment for processing.
- RemoteEnvironment - Interface in org.apache.beam.runners.fnexecution.environment
-
A handle to an available remote RunnerApi.Environment.
- RemoteEnvironment.SimpleRemoteEnvironment - Class in org.apache.beam.runners.fnexecution.environment
-
A RemoteEnvironment which uses the default RemoteEnvironment.close() behavior.
- RemoteEnvironmentOptions - Interface in org.apache.beam.sdk.options
-
Options that are used to control configuration of the remote environment.
- RemoteEnvironmentOptions.Options - Class in org.apache.beam.sdk.options
-
Register the RemoteEnvironmentOptions.
- RemoteGrpcPortRead - Class in org.apache.beam.sdk.fn.data
-
An execution-time only RunnerApi.PTransform which represents an SDK harness reading from a BeamFnApi.RemoteGrpcPort.
- RemoteGrpcPortRead() - Constructor for class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
- RemoteGrpcPortWrite - Class in org.apache.beam.sdk.fn.data
-
An execution-time only RunnerApi.PTransform which represents a write from within an SDK harness to a BeamFnApi.RemoteGrpcPort.
- RemoteGrpcPortWrite() - Constructor for class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
- RemoteInputDestination<T> - Class in org.apache.beam.runners.fnexecution.data
-
A pair of Coder and BeamFnApi.Target used by a FnDataService to send data to a remote harness.
- RemoteInputDestination() - Constructor for class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
-
A pair of
Coder
andFnDataReceiver
which can be registered to receive elements for aLogicalEndpoint
. - RemoteOutputReceiver() - Constructor for class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
- remove() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
- remove(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- remove(Collection<String>) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
- remove(K) - Method in interface org.apache.beam.sdk.state.MapState
-
Remove the mapping for a key from this map if it is present.
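For illustration, a minimal sketch of MapState in a stateful DoFn; the state id and field names are illustrative:

    private static class TrackSeen extends DoFn<KV<String, String>, Void> {
      @StateId("seen")
      private final StateSpec<MapState<String, Integer>> seenSpec =
          StateSpecs.map(StringUtf8Coder.of(), VarIntCoder.of());

      @ProcessElement
      public void process(
          @Element KV<String, String> e,
          @StateId("seen") MapState<String, Integer> seen) {
        seen.put(e.getValue(), 1);
        seen.remove("stale"); // drops the mapping if it is present
      }
    }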
- remove(K) - Method in interface org.apache.beam.sdk.state.MultimapState
-
Removes all values associated with the key from this multimap.
- remove(T) - Method in interface org.apache.beam.sdk.state.SetState
-
Removes the specified element from this set if it is present.
- removeAllPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- removeBucket(Bucket) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Removes an empty Bucket in Cloud Storage or propagates an exception.
- removeMetadata(MetadataEntity) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- removePipelineOption(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- removePrefix() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Remove prefix, e.g.
- removeProperties(MetadataEntity) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- removeProperties(MetadataEntity, String...) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- removeStagedArtifacts(String) - Method in interface org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestinationProvider
- removeStagedArtifacts(String) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
- removeTags(MetadataEntity) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- removeTags(MetadataEntity, String...) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
- removeTemporaryFiles(Collection<ResourceId>) - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
- removeWatermarkHoldUsage(Instant) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals
- rename(Iterable<String>, Iterable<String>, MoveOptions...) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
- rename(String, String) - Method in class org.apache.beam.sdk.schemas.transforms.RenameFields.Inner
-
Rename a specific field.
- rename(List<ClassLoaderFileSystem.ClassLoaderResourceId>, List<ClassLoaderFileSystem.ClassLoaderResourceId>, MoveOptions...) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
- rename(List<ResourceId>, List<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Renames a List of file-like resources from one location to another.
- rename(List<ResourceIdT>, List<ResourceIdT>, MoveOptions...) - Method in class org.apache.beam.sdk.io.FileSystem
-
Renames a List of file-like resources from one location to another.
- rename(FieldAccessDescriptor, String) - Method in class org.apache.beam.sdk.schemas.transforms.RenameFields.Inner
-
Rename a specific field.
- RenameFields - Class in org.apache.beam.sdk.schemas.transforms
-
A transform for renaming fields inside an existing schema.
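For illustration, a minimal sketch of renaming a field on a schema'd PCollection; the field names are illustrative:

    PCollection<Row> renamed =
        rows.apply(RenameFields.<Row>create().rename("userName", "user_name"));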
- RenameFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.RenameFields
- RenameFields.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
-
The class implementing the actual PTransform.
- REPEATABLE_ERROR_TYPES - Static variable in class org.apache.beam.io.requestresponse.RequestResponseIO
-
Set of UserCodeExecutionExceptions that warrant repeating.
- Repeatedly - Class in org.apache.beam.sdk.transforms.windowing
-
A Trigger that fires according to its subtrigger forever.
- replace(Class<V>, T) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
- replace(String, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- REPLACE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- REPLACE_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
- REPLACE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- replaceAll(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.ReplaceAll PTransform that checks if a portion of the line matches the Regex and replaces all matches with the replacement String.
- replaceAll(List<PTransformOverride>) - Method in class org.apache.beam.sdk.Pipeline
-
For internal use only; no backwards-compatibility guarantees.
- replaceAll(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.ReplaceAll PTransform that checks if a portion of the line matches the Regex and replaces all matches with the replacement String.
- ReplaceAll(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.ReplaceAll
- ReplaceBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReplaceBuilder
- replaceFirst(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.ReplaceFirst PTransform that checks if a portion of the line matches the Regex and replaces the first match with the replacement String.
- replaceFirst(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.ReplaceFirst PTransform that checks if a portion of the line matches the Regex and replaces the first match with the replacement String.
- ReplaceFirst(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.ReplaceFirst
- replaceGcsFilesWithLocalFiles(List<String>) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner
-
Replaces GCS file paths with local file paths by downloading the GCS files locally.
- replaceTransforms(Pipeline, StreamingOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
- replaceV1Transforms(Pipeline) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
- ReplicaInfo() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.ReplicaInfo
- report() - Method in class org.apache.beam.runners.spark.metrics.sink.CsvSink
- report() - Method in class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
- report() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleCsvSink
- report() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleGraphiteSink
- REPORT_FAILURES - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
-
Invalid mutations will be returned as part of the result of the write transform.
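For illustration, a minimal sketch of enabling this mode and inspecting the failures; the instance and database ids are illustrative:

    SpannerWriteResult result =
        mutations.apply(
            SpannerIO.write()
                .withInstanceId("my-instance")
                .withDatabaseId("my-database")
                .withFailureMode(SpannerIO.FailureMode.REPORT_FAILURES));
    PCollection<MutationGroup> failed = result.getFailedMutations();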
- reportCheckpointDuration(long) - Method in class org.apache.beam.runners.flink.translation.utils.CheckpointStats
- reportElementSize(long) - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
- reportError(String, Throwable) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
- reportFailedRPCMetrics(RetryManager.Operation.Context<?>, BigQuerySinkMetrics.RpcMethod) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
-
Records RpcRequests counter and RpcLatency histogram for this RPC call.
- reportFailedRPCMetrics(RetryManager.Operation.Context<?>, BigQuerySinkMetrics.RpcMethod, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
-
Records RpcRequests counter and RpcLatency histogram for this RPC call.
- reportLineage(ResourceIdT, Lineage) - Method in class org.apache.beam.sdk.io.FileSystem
-
Report Lineage metrics for resource id at file level.
- reportLineage(ResourceIdT, Lineage, FileSystem.LineageLevel) - Method in class org.apache.beam.sdk.io.FileSystem
-
Report Lineage metrics for resource id to a given level.
- reportPCollectionConsumed(PCollection<?>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Reports that the given PCollection is consumed by a PTransform in the pipeline.
- reportPCollectionProduced(PCollection<?>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
in the pipeline. - reportPCollectionProduced(PCollection<?>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Reports that the given PCollection is produced by a PTransform in the pipeline.
- reportPendingMetrics() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
in the pipeline. - reportPendingMetrics() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
-
Call this method on Work Item thread to report outstanding metrics.
- reportSinkLineage(ResourceId) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Report sink Lineage metrics for resource id.
- reportSinkLineage(ResourceId, FileSystem.LineageLevel) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Report sink Lineage metrics for resource id at given level.
- reportSourceLineage(ResourceId) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Report source Lineage metrics for resource id.
- reportSourceLineage(ResourceId, FileSystem.LineageLevel) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Report source Lineage metrics for resource id at given level.
- reportSuccessfulRpcMetrics(RetryManager.Operation.Context<?>, BigQuerySinkMetrics.RpcMethod) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
-
Records RpcRequests counter and RpcLatency histogram for this RPC call.
- reportSuccessfulRpcMetrics(RetryManager.Operation.Context<?>, BigQuerySinkMetrics.RpcMethod, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
-
Records RpcRequests counter and RpcLatency histogram for this RPC call.
- reportWorkItemStatus(String, ReportWorkItemStatusRequest) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
Reports the status of the work item for jobId.
- requestProgress() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
-
Ask the remote bundle for progress.
- requestProgress() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
- RequestResponseIO<RequestT, ResponseT> - Class in org.apache.beam.io.requestresponse
-
PTransform for reading from and writing to Web APIs.
- requestsFinalization(String) - Method in interface org.apache.beam.runners.fnexecution.control.BundleFinalizationHandler
-
This callback is invoked whenever an inflight bundle that is being processed requests finalization.
- requestsFinalization(String) - Method in class org.apache.beam.runners.fnexecution.control.BundleFinalizationHandlers.InMemoryFinalizer
- requestsInProgress() - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
- requestTimeMsSinceEpoch() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
-
Timestamp (in system time) at which we requested the message (ms since epoch).
- REQUIRED_MEMORY_FOR_DEFAULT_BUFFER_BYTES - Static variable in class org.apache.beam.sdk.io.aws2.options.S3Options.S3UploadBufferSizeBytesFactory
- Required Minimum Functionality - Search tag in class org.apache.beam.sdk.io.googleads.GoogleAdsV19
- Section
- Requirements - Class in org.apache.beam.sdk.transforms
-
Describes the run-time requirements of a Contextful, such as access to side inputs.
- requiresDataSchema() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
- requiresDataSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Indicates whether this transform requires a specified data schema.
- requiresDataSchema() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
- requiresDataSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- requiresDataSchema() - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
- requiresDataSchema() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
-
Indicates whether the dataSchema value is necessary.
- requiresDeduping() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- requiresDeduping() - Method in class org.apache.beam.sdk.io.UnboundedSource
-
Returns whether this source requires explicit deduping.
- requiresSideInputs(Collection<PCollectionView<?>>) - Static method in class org.apache.beam.sdk.transforms.Requirements
-
Describes the need for access to the given side inputs.
- requiresSideInputs(PCollectionView<?>...) - Static method in class org.apache.beam.sdk.transforms.Requirements
- requiresStableInput(RunnerApi.ExecutableStagePayload) - Static method in class org.apache.beam.runners.flink.translation.utils.FlinkPortableRunnerUtils
- requiresTimeSortedInput(RunnerApi.ExecutableStagePayload, boolean) - Static method in class org.apache.beam.runners.flink.translation.utils.FlinkPortableRunnerUtils
- reset() - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
- reset() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
- reset() - Method in class org.apache.beam.sdk.fn.CancellableQueue
-
Enables the queue to be re-used after it has been cancelled.
- reset() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Enables this receiver to be used again for another bundle.
- reset() - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
- reset() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
- resetCache() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Resets the set of interfaces registered with this factory to the default state.
- resetForNewKey() - Method in class org.apache.beam.runners.fnexecution.state.InMemoryBagUserStateFactory
-
Prepares previous emitted state handlers for processing a new key.
- resetLocal() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
- resetTo(int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Resets the end of the stream to the specified position.
- reshuffle(JavaRDD<WindowedValue<T>>, WindowedValues.WindowedValueCoder<T>) - Static method in class org.apache.beam.runners.spark.translation.GroupCombineFunctions
-
An implementation of Reshuffle for the Spark runner.
- Reshuffle<K, V> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards compatibility guarantees.
- Reshuffle.AssignShardFn<T> - Class in org.apache.beam.sdk.transforms
- Reshuffle.ViaRandomKey<T> - Class in org.apache.beam.sdk.transforms
-
Implementation of Reshuffle.viaRandomKey().
- ReshuffleTrigger<W> - Class in org.apache.beam.sdk.transforms.windowing
-
For internal use only; no backwards compatibility guarantees.
- ReshuffleTrigger() - Constructor for class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
- resolve(String) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- resolve(String, ResolveOptions) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
- resolve(String, ResolveOptions) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
- resolve(String, ResolveOptions) - Method in interface org.apache.beam.sdk.io.fs.ResourceId
-
Returns a child ResourceId under this.
- resolve(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- resolve(Supplier<PipelineOptions>, Dataset<WindowedValue<InT>>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.UnresolvedTranslation
- resolve(Schema) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Resolve the FieldAccessDescriptor against a schema.
- RESOLVE_DIRECTORY - Enum constant in enum class org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions
-
Resolve a directory.
- RESOLVE_FILE - Enum constant in enum class org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions
-
Resolve a file.
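For illustration, a minimal sketch of resolving children against a directory ResourceId; the paths are illustrative:

    ResourceId dir =
        FileSystems.matchNewResource("gs://my-bucket/output/", true /* isDirectory */);
    ResourceId file =
        dir.resolve("part-0001.txt", ResolveOptions.StandardResolveOptions.RESOLVE_FILE);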
- resolveAlias(ResolvedColumn) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
- resolveArtifacts(ArtifactApi.ResolveArtifactsRequest, StreamObserver<ArtifactApi.ResolveArtifactsResponse>) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- resolveArtifacts(RunnerApi.Pipeline) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
- resolveCalciteTable(SchemaPlus, List<String>) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.TableResolution
-
Resolves tablePath according to the given schemaPlus.
- resolveCredentials() - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider
- resolvedTables - Variable in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
- resolveMetastore() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- ResolveOptions - Interface in org.apache.beam.sdk.io.fs
-
An object that configures ResourceId.resolve(java.lang.String, org.apache.beam.sdk.io.fs.ResolveOptions).
- ResolveOptions.StandardResolveOptions - Enum Class in org.apache.beam.sdk.io.fs
-
Defines the standard resolve options.
- resolveSchemaCompatibility(TypeSerializer<StateNamespace>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer.FlinkStateNameSpaceSerializerSnapshot
- resolveSchemaCompatibility(TypeSerializer<T>) - Method in class org.apache.beam.runners.flink.translation.types.UnversionedTypeSerializerSnapshot
- resolveSibling(String) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- resolveSibling(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- resolveTokenValue(String) - Method in class org.apache.beam.sdk.io.aws2.auth.GoogleADCIdTokenProvider
- resolveTokenValue(String) - Method in interface org.apache.beam.sdk.io.aws2.auth.WebIdTokenProvider
-
Resolves the value for an OIDC web identity token.
- resolveType(Type) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a TypeDescriptor representing the given type, with type variables resolved according to the specialization in this type.
- RESOURCE - Enum constant in enum class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
-
The source file contains one or more lines of newline-delimited JSON (ndjson).
- RESOURCE_HINTS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- RESOURCE_ID - Static variable in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
- ResourceHint - Class in org.apache.beam.sdk.transforms.resourcehints
-
Provides a definition of a resource hint known to the SDK.
- ResourceHint() - Constructor for class org.apache.beam.sdk.transforms.resourcehints.ResourceHint
- resourceHints - Variable in class org.apache.beam.sdk.transforms.PTransform
- ResourceHints - Class in org.apache.beam.sdk.transforms.resourcehints
-
Pipeline authors can use resource hints to provide additional information to runners about the desired aspects of the execution environment.
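For illustration, a minimal sketch of attaching a hint to a transform; HeavyDoFn and the 8GB value are illustrative:

    PCollection<String> out =
        input.apply(
            "HeavyStep",
            ParDo.of(new HeavyDoFn())
                .setResourceHints(ResourceHints.create().withMinRam("8GB")));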
- ResourceHintsOptions - Interface in org.apache.beam.sdk.transforms.resourcehints
-
Options that are used to control configuration of the remote environment.
- ResourceHintsOptions.EmptyListDefault - Class in org.apache.beam.sdk.transforms.resourcehints
- ResourceHintsOptions.Options - Class in org.apache.beam.sdk.transforms.resourcehints
-
Register the ResourceHintsOptions.
- resourceId() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
- ResourceId - Interface in org.apache.beam.sdk.io.fs
-
An identifier which represents a file-like resource.
- ResourceIdCoder - Class in org.apache.beam.sdk.io.fs
-
A Coder for ResourceId.
- ResourceIdCoder() - Constructor for class org.apache.beam.sdk.io.fs.ResourceIdCoder
- ResourceIdTester - Class in org.apache.beam.sdk.io.fs
-
A utility to test ResourceId implementations.
- responseReceivedEx(Object) - Method in class org.apache.beam.sdk.io.solace.broker.PublishResultHandler
- restoreEnumerator(SplitEnumeratorContext<FlinkSourceSplit<T>>, Map<Integer, List<FlinkSourceSplit<T>>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- restoreSerializer() - Method in class org.apache.beam.runners.flink.translation.types.UnversionedTypeSerializerSnapshot
- restoreSerializer() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer.FlinkStateNameSpaceSerializerSnapshot
- RESTRICTION_CODER - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
Deprecated. Uses the incorrect terminology; use PropertyNames.RESTRICTION_ENCODING instead. Should be removed once non-FnAPI SplittableDoFn expansion for Dataflow is removed.
- RESTRICTION_ENCODING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- RestrictionInterrupter<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
An interrupter for restriction tracker of type T.
- Restrictions - Search tag in interface org.apache.beam.sdk.options.PipelineOptions
- Section
- restrictionTracker(StreamProgress) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
- restrictionTracker(PulsarSourceDescriptor, OffsetRange) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
- restrictionTracker(OffsetRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
- RestrictionTracker<RestrictionT, PositionT> - Class in org.apache.beam.sdk.transforms.splittabledofn
-
Manages access to the restriction and keeps track of its claimed part for a splittable DoFn.
- RestrictionTracker() - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
- RestrictionTracker.HasProgress - Interface in org.apache.beam.sdk.transforms.splittabledofn
-
All RestrictionTrackers SHOULD implement this interface to improve auto-scaling and splitting performance.
- RestrictionTracker.IsBounded - Enum Class in org.apache.beam.sdk.transforms.splittabledofn
- RestrictionTracker.Progress - Class in org.apache.beam.sdk.transforms.splittabledofn
-
A representation for the amount of known completed and remaining work.
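For orientation on the RestrictionTracker entries above, here is a minimal sketch of how a splittable DoFn drives a tracker (the element type, range, and outputs are illustrative):
    import org.apache.beam.sdk.io.range.OffsetRange;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker;

    class EmitOffsets extends DoFn<String, Long> {
      @GetInitialRestriction
      public OffsetRange initialRestriction(@Element String element) {
        return new OffsetRange(0, 100); // illustrative fixed range
      }

      @ProcessElement
      public void process(RestrictionTracker<OffsetRange, Long> tracker, OutputReceiver<Long> out) {
        // Each position must be claimed before the corresponding work is done;
        // tryClaim returning false means the restriction was split or exhausted.
        for (long i = tracker.currentRestriction().getFrom(); tracker.tryClaim(i); ++i) {
          out.output(i);
        }
      }
    }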
- RestrictionTracker.TruncateResult<RestrictionT> - Class in org.apache.beam.sdk.transforms.splittabledofn
-
A representation of the truncate result.
- RestrictionTrackers - Class in org.apache.beam.sdk.fn.splittabledofn
-
Support utilities for interacting with RestrictionTrackers.
- RestrictionTrackers() - Constructor for class org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers
- RestrictionTrackers.ClaimObserver<PositionT> - Interface in org.apache.beam.sdk.fn.splittabledofn
-
Interface allowing a runner to observe the calls to RestrictionTracker.tryClaim(PositionT).
- Result<ResponseT> - Class in org.apache.beam.io.requestresponse
- Result() - Constructor for class org.apache.beam.sdk.schemas.transforms.CoGroup.Result
- Result() - Constructor for class org.apache.beam.sdk.transforms.WithFailures.Result
- resultBuilder() - Static method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
- resume() - Static method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
-
Indicates that there is more work to be done for the current element.
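In a splittable DoFn, this continuation is returned from the @ProcessElement method; a minimal sketch (the ten-second delay is an arbitrary choice):
    import org.joda.time.Duration;
    import org.apache.beam.sdk.transforms.DoFn;

    // Inside a splittable DoFn's @ProcessElement method: ask the runner to
    // call @ProcessElement again for this element after roughly ten seconds.
    return DoFn.ProcessContinuation.resume().withResumeDelay(Duration.standardSeconds(10));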
- RESUME_OR_FAIL - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ExistingPipelineOptions
- RESUME_OR_NEW - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ExistingPipelineOptions
- resumeDelay() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
-
A minimum duration that should elapse between the end of this DoFn.ProcessElement call and the DoFn.ProcessElement call continuing processing of the same element.
- resumeFromPreviousPipelineAction(ChangeStreamMetrics, MetadataTableDao, Instant, ProcessNewPartitionsAction) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
- ResumeFromPreviousPipelineAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
- ResumeFromPreviousPipelineAction(ChangeStreamMetrics, MetadataTableDao, Instant, ProcessNewPartitionsAction) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ResumeFromPreviousPipelineAction
- retain(KeyT) - Method in class org.apache.beam.sdk.io.aws2.common.ObjectPool
-
Retain a reference to a shared client instance.
- retain(AwsOptions, ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.common.ObjectPool.ClientPool
-
Retain a reference to a shared client instance.
- RETRACTING_FIRED_PANES - Enum constant in enum class org.apache.beam.sdk.values.WindowingStrategy.AccumulationMode
- Retries - Search tag in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO
- Section
- Retries - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- Retries - Search tag in class org.apache.beam.sdk.io.aws2.sns.SnsIO
- Section
- Retries - Search tag in class org.apache.beam.sdk.io.aws2.sqs.SqsIO
- Section
- retrievalToken() - Method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
- retrieveDicomStudyMetadata(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Retrieve DicomStudyMetadata.
- retrieveDicomStudyMetadata(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- retrieveFieldNames(List<ResolvedColumn>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
- retrieveInternalProcessingTimerQueue(InternalTimerService<TimerInternals.TimerData>) - Static method in class org.apache.beam.runners.flink.translation.utils.Workarounds
- retrieveRexNode(ResolvedNodes.ResolvedProjectScan, List<RelDataTypeField>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
-
Extract expressions from a project scan node.
- retrieveRexNodeFromOrderByScan(RelOptCluster, ResolvedNodes.ResolvedOrderByScan, List<RelDataTypeField>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
-
Extract expressions from order by scan node.
- retry() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
-
Optional RetryConfiguration for AWS clients.
- retry(Consumer<RetryConfiguration.Builder>) - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
-
Optional RetryConfiguration for AWS clients.
- retry(RetryConfiguration) - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
-
Optional RetryConfiguration for AWS clients.
- retryCallable(Callable<V>, Set<Class<? extends Exception>>) - Method in class org.apache.beam.sdk.io.solace.RetryCallableManager
-
Executes the callable argument, retrying the execution if it throws one of the exceptions in the exceptionsToIntercept set.
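A rough sketch of how retryCallable might be used (the create() factory and fetchValue() call are assumptions for illustration, not confirmed by this index):
    import java.io.IOException;
    import java.util.Set;
    import org.apache.beam.sdk.io.solace.RetryCallableManager;

    // Hypothetical usage: re-run the callable whenever it throws IOException.
    RetryCallableManager manager = RetryCallableManager.create(); // factory assumed
    Set<Class<? extends Exception>> retryOn = Set.of(IOException.class);
    int value = manager.retryCallable(() -> fetchValue(), retryOn); // fetchValue() is hypothetical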
- RetryCallableManager - Class in org.apache.beam.sdk.io.solace
-
A class that manages retrying of callables based on the exceptions they throw.
- RetryCallableManager() - Constructor for class org.apache.beam.sdk.io.solace.RetryCallableManager
- RetryCallableManager.Builder - Class in org.apache.beam.sdk.io.solace
- RetryConfiguration - Class in org.apache.beam.sdk.io.aws2.common
-
Configuration of the retry behavior for AWS SDK clients.
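A minimal sketch of wiring retry behavior into a client configuration through the retry(...) builder methods listed above (the numRetries setting is an assumption about the builder API):
    import org.apache.beam.sdk.io.aws2.common.ClientConfiguration;

    // Illustrative: limit AWS SDK calls to 5 retry attempts.
    ClientConfiguration config =
        ClientConfiguration.builder()
            .retry(b -> b.numRetries(5)) // numRetries is an assumed builder method
            .build();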
- RetryConfiguration - Class in org.apache.beam.sdk.io.jms
- RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
- RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.RetryConfiguration
- RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.RetryConfiguration
- RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.jms.RetryConfiguration
- RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.RetryConfiguration
- RetryConfiguration.Builder - Class in org.apache.beam.sdk.io.aws2.common
- RetryHttpRequestInitializer - Class in org.apache.beam.sdk.extensions.gcp.util
-
Implements a request initializer that adds retry handlers to all HttpRequests.
- RetryHttpRequestInitializer() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
- RetryHttpRequestInitializer(Collection<Integer>) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
- RetryHttpRequestInitializer(Collection<Integer>, HttpResponseInterceptor) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
- retryTransientErrors() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
-
Retry all failures except for known persistent errors.
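For instance, the policy can be attached to a BigQuery streaming write (a sketch; the input collection and table spec are hypothetical):
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy;

    // Retry only transient insert failures; known persistent errors are
    // surfaced rather than retried forever.
    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table") // hypothetical table
            .withFailedInsertRetryPolicy(InsertRetryPolicy.retryTransientErrors()));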
- reverse(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- REVERSE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- REVERSE_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
- reverseArtifactRetrievalService(StreamObserver<ArtifactApi.ArtifactRequestWrapper>) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
- reverseBytes(byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- Reversed() - Constructor for class org.apache.beam.sdk.transforms.Top.Reversed
- reverseString(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- revision() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
A monotonically increasing revision number of this PipelineOptions object that can be used to detect changes.
- RHS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.Join
- right(Integer...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual.Impl
- right(Integer...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
- right(String...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual.Impl
- right(String...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
- right(FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual.Impl
- right(FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
- RIGHT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinPushThroughJoinRule
-
Instance of the rule that works on logical joins only, and pushes to the right.
- rightOuterJoin(String, PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
-
Right Outer Join of two collections of KV elements.
- rightOuterJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
-
Right Outer Join of two collections of KV elements.
- rightOuterJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
-
Perform a right outer join.
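To make the join-library rightOuterJoin entries above concrete (a sketch; the key and value types, the input collections, and the empty-string placeholder for missing left values are arbitrary):
    import org.apache.beam.sdk.extensions.joinlibrary.Join;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Keep every right-side element; substitute "" where no left value exists.
    PCollection<KV<String, KV<String, Long>>> joined =
        Join.rightOuterJoin(leftInput, rightInput, ""); // leftInput/rightInput are hypothetical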
- RingRange - Class in org.apache.beam.sdk.io.cassandra
-
Models a Cassandra token range.
- rollback() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- rollback(Savepoint) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- root() - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
-
Path for display data registered by a top-level component.
- roundRobinSubList(List<T>, int, int) - Static method in class org.apache.beam.runners.jet.Utils
-
Assigns the list to count sublists in a round-robin fashion.
- route(DoFn.MultiOutputReceiver, RecordT, Coder<RecordT>, Exception, String) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter.RecordingBadRecordRouter
- route(DoFn.MultiOutputReceiver, RecordT, Coder<RecordT>, Exception, String) - Method in interface org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter
- route(DoFn.MultiOutputReceiver, RecordT, Coder<RecordT>, Exception, String) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter.ThrowingBadRecordRouter
- route(DoFn.FinishBundleContext, RecordT, Coder<RecordT>, Exception, String, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter.RecordingBadRecordRouter
- route(DoFn.FinishBundleContext, RecordT, Coder<RecordT>, Exception, String, BoundedWindow) - Method in interface org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter
- route(DoFn.FinishBundleContext, RecordT, Coder<RecordT>, Exception, String, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter.ThrowingBadRecordRouter
- row(Schema) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Create a row type with the given schema.
- Row - Class in org.apache.beam.sdk.values
-
Row is an immutable tuple-like schema to represent one element in a PCollection.
- ROW - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- ROW - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- ROW_PROPERTY_MUTATION_INFO - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- ROW_PROPERTY_MUTATION_SQN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- ROW_PROPERTY_MUTATION_TYPE - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- ROW_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
- ROW_SCHEMA_MUTATION_INFO - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
- Row.Builder - Class in org.apache.beam.sdk.values
-
Builder for Row.
- Row.Equals - Class in org.apache.beam.sdk.values
- Row.FieldValueBuilder - Class in org.apache.beam.sdk.values
-
Builder for Row that bases a row on another row.
- rowBag(Schema) - Static method in class org.apache.beam.sdk.state.StateSpecs
- RowBundle<T> - Class in org.apache.beam.sdk.jmh.schemas
-
Bundle of rows according to the configured Factory as input for benchmarks.
- RowBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundle
- RowBundle(Class<T>) - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundle
- RowBundle.Action - Enum Class in org.apache.beam.sdk.jmh.schemas
- RowBundles - Interface in org.apache.beam.sdk.jmh.schemas
- RowBundles.ArrayOfNestedStringBundle - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.ArrayOfNestedStringBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.ArrayOfStringBundle - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.ArrayOfStringBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.ByteBufferBundle - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.ByteBufferBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.BytesBundle - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.BytesBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.DateTimeBundle - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.DateTimeBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.IntBundle - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.IntBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.MapOfIntBundle - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.MapOfIntBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.MapOfNestedIntBundle - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.MapOfNestedIntBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.NestedBytesBundle - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.NestedBytesBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.NestedIntBundle - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.NestedIntBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.StringBuilderBundle - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.StringBuilderBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.StringBundle - Class in org.apache.beam.sdk.jmh.schemas
- RowBundles.StringBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
- RowCoder - Class in org.apache.beam.sdk.coders
-
A sub-class of SchemaCoder that can only encode Row instances.
- RowCoderCloudObjectTranslator - Class in org.apache.beam.runners.dataflow.util
-
Translator for row coders.
- RowCoderCloudObjectTranslator() - Constructor for class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
- RowCoderGenerator - Class in org.apache.beam.sdk.coders
- RowCoderGenerator() - Constructor for class org.apache.beam.sdk.coders.RowCoderGenerator
- rowFromProto(SchemaApi.Row, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.SchemaTranslation
- rowMap(Schema, Schema) - Static method in class org.apache.beam.sdk.state.StateSpecs
- RowMessages - Class in org.apache.beam.sdk.schemas
- rowMultimap(Schema, Schema) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Create a StateSpec for a MultimapState, optimized for key lookups, key puts, and clear.
- RowMutation - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A convenience class for applying row updates to BigQuery using BigQueryIO.applyRowMutations().
- RowMutation() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.RowMutation
- RowMutation.RowMutationCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
- RowMutationCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.RowMutation.RowMutationCoder
- RowMutationInformation - Class in org.apache.beam.sdk.io.gcp.bigquery
-
This class indicates how to apply a row update to BigQuery.
- RowMutationInformation() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
- RowMutationInformation.MutationType - Enum Class in org.apache.beam.sdk.io.gcp.bigquery
- rowOrderedList(Schema) - Static method in class org.apache.beam.sdk.state.StateSpecs
- rowReceiver(DoFn.WindowedContext, TupleTag<T>, SchemaCoder<T>) - Static method in class org.apache.beam.sdk.transforms.DoFnOutputReceivers
-
Returns a DoFn.OutputReceiver that automatically converts a Row to the user's output type and delegates to DoFnOutputReceivers.WindowedContextOutputReceiver.
- rowRestrictionProvider - Variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- rows() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The TypeDescriptor for Row.
- RowSchemaInformationProvider - Class in org.apache.beam.sdk.schemas.utils
- RowSchemaInformationProvider() - Constructor for class org.apache.beam.sdk.schemas.utils.RowSchemaInformationProvider
- RowSelector - Interface in org.apache.beam.sdk.schemas.utils
-
A selector interface for extracting fields from a row.
- RowSelectorContainer(Schema, FieldAccessDescriptor, boolean) - Constructor for class org.apache.beam.sdk.schemas.utils.SelectHelpers.RowSelectorContainer
- rowSet(Schema) - Static method in class org.apache.beam.sdk.state.StateSpecs
- rowsFromRecordBatch(Schema, VectorSchemaRoot) - Static method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion
-
Returns an ArrowConversion.RecordBatchRowIterator backed by the Arrow record batch stored in vectorSchemaRoot.
- rowsFromSerializedRecordBatch(Schema, InputStream, RootAllocator) - Static method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion
- rowToBytesFn(SchemaProvider, TypeDescriptor<T>, Coder<? super T>) - Static method in class org.apache.beam.sdk.schemas.RowMessages
- rowToBytesFn(SchemaProvider, TypeDescriptor<T>, ProcessFunction<? super T, byte[]>) - Static method in class org.apache.beam.sdk.schemas.RowMessages
- RowToEntity - Class in org.apache.beam.sdk.io.gcp.datastore
- rowToProto(Row) - Static method in class org.apache.beam.sdk.schemas.SchemaTranslation
- RowUtils - Class in org.apache.beam.sdk.io.gcp.bigtable
- RowUtils() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- rowValue(Schema) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Create a StateSpec for a row value with the specified schema.
- RowWithGetters<T> - Class in org.apache.beam.sdk.values
-
A concrete subclass of Row that delegates to a set of provided FieldValueGetters.
- RowWithStorage - Class in org.apache.beam.sdk.values
-
Concrete subclass of Row that explicitly stores all fields of the row.
- rpad(byte[], Long) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- rpad(byte[], Long, byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- rpad(String, Long) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- rpad(String, Long, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- RpcQosOptions - Class in org.apache.beam.sdk.io.gcp.firestore
-
Quality of Service manager options for Firestore RPCs.
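A small sketch of building RpcQosOptions (the newBuilder()/withMaxAttempts method names are assumptions about the builder API; values are illustrative):
    import org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions;

    // Illustrative QoS tuning for Firestore RPCs.
    RpcQosOptions qos = RpcQosOptions.newBuilder().withMaxAttempts(3).build();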
- RpcQosOptions.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
Mutable Builder class for creating instances of RpcQosOptions.
- RPUSH - Enum constant in enum class org.apache.beam.sdk.io.redis.RedisIO.Write.Method
-
Use RPUSH command.
- rtrim(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- rtrim(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- RTRIM - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- RTRIM_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
- RULE_arrayQualifier - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- RULE_dotExpression - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- RULE_dotExpressionComponent - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- RULE_fieldSpecifier - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- RULE_mapQualifier - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- RULE_qualifiedComponent - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- RULE_qualifierList - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- ruleNames - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- ruleNames - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- run() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
- run() - Method in interface org.apache.beam.sdk.function.ThrowingRunnable
- run() - Method in class org.apache.beam.sdk.Pipeline
-
Runs this Pipeline according to the PipelineOptions used to create the Pipeline via Pipeline.create(PipelineOptions).
- run() - Method in class org.apache.beam.sdk.testing.TestPipeline
-
Runs this TestPipeline, unwrapping any AssertionError that is raised during testing.
- run(JobApi.RunJobRequest, StreamObserver<JobApi.RunJobResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
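Tying the Pipeline.run() and TestPipeline.run() entries above together, a typical execution looks like this (a minimal sketch; args comes from a surrounding main method):
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    Pipeline p = Pipeline.create(options);
    // ... apply transforms to p ...
    p.run().waitUntilFinish(); // run() hands the pipeline to the configured runner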
- run(JobApi.RunJobRequest, StreamObserver<JobApi.RunJobResponse>) - Method in class org.apache.beam.runners.portability.testing.TestJobService
- run(RunnerApi.Pipeline, JobInfo) - Method in class org.apache.beam.runners.flink.FlinkPipelineRunner
- run(RunnerApi.Pipeline, JobInfo) - Method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarCreator
-
Does not actually run the pipeline; instead, bundles the input pipeline along with all dependencies, artifacts, etc.
- run(RunnerApi.Pipeline, JobInfo) - Method in interface org.apache.beam.runners.jobsubmission.PortablePipelineRunner
- run(RunnerApi.Pipeline, JobInfo) - Method in class org.apache.beam.runners.spark.SparkPipelineRunner
- run(PartitionRecord, ChangeStreamRecord, RestrictionTracker<StreamProgress, StreamProgress>, DoFn.OutputReceiver<KV<ByteString, ChangeStreamRecord>>, ManualWatermarkEstimator<Instant>, BytesThroughputEstimator<KV<ByteString, ChangeStreamRecord>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ChangeStreamAction
-
This class processes ReadChangeStreamResponse messages from the Bigtable server.
- run(PartitionRecord, RestrictionTracker<StreamProgress, StreamProgress>, DoFn.OutputReceiver<KV<ByteString, ChangeStreamRecord>>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ReadChangeStreamPartitionAction
-
Streams changes from a specific partition.
- run(PartitionMetadata, ChildPartitionsRecord, RestrictionTracker<TimestampRange, Timestamp>, RestrictionInterrupter<Timestamp>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ChildPartitionsRecordAction
-
This is the main processing function for a ChildPartitionsRecord.
- run(PartitionMetadata, DataChangeRecord, RestrictionTracker<TimestampRange, Timestamp>, RestrictionInterrupter<Timestamp>, DoFn.OutputReceiver<DataChangeRecord>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DataChangeRecordAction
-
This is the main processing function for a DataChangeRecord.
- run(PartitionMetadata, HeartbeatRecord, RestrictionTracker<TimestampRange, Timestamp>, RestrictionInterrupter<Timestamp>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.HeartbeatRecordAction
-
This is the main processing function for a HeartbeatRecord.
- run(PartitionMetadata, PartitionEndRecord, RestrictionTracker<TimestampRange, Timestamp>, RestrictionInterrupter<Timestamp>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.PartitionEndRecordAction
-
This is the main processing function for a PartitionEndRecord.
- run(PartitionMetadata, PartitionEventRecord, RestrictionTracker<TimestampRange, Timestamp>, RestrictionInterrupter<Timestamp>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.PartitionEventRecordAction
-
This is the main processing function for a PartitionEventRecord.
- run(PartitionMetadata, PartitionStartRecord, RestrictionTracker<TimestampRange, Timestamp>, RestrictionInterrupter<Timestamp>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.PartitionStartRecordAction
-
This is the main processing function for a PartitionStartRecord.
- run(PartitionMetadata, RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<DataChangeRecord>, ManualWatermarkEstimator<Instant>, DoFn.BundleFinalizer) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.QueryChangeStreamAction
-
This method dispatches a change stream query for the given partition, delegates the processing of the records to one of the registered action classes, and keeps the state of the partition up to date in the Connector's metadata table.
- run(PipelineOptions) - Method in class org.apache.beam.sdk.Pipeline
-
Runs this Pipeline using the given PipelineOptions, using the runner specified by the options.
- run(PipelineOptions) - Method in class org.apache.beam.sdk.testing.TestPipeline
-
Like TestPipeline.run() but with the given potentially modified options.
- run(Pipeline) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
- run(Pipeline) - Method in class org.apache.beam.runners.dataflow.TestDataflowRunner
- run(Pipeline) - Method in class org.apache.beam.runners.direct.DirectRunner
- run(Pipeline) - Method in class org.apache.beam.runners.flink.FlinkRunner
- run(Pipeline) - Method in class org.apache.beam.runners.flink.TestFlinkRunner
- run(Pipeline) - Method in class org.apache.beam.runners.jet.JetRunner
- run(Pipeline) - Method in class org.apache.beam.runners.portability.PortableRunner
- run(Pipeline) - Method in class org.apache.beam.runners.portability.testing.TestPortableRunner
- run(Pipeline) - Method in class org.apache.beam.runners.portability.testing.TestUniversalRunner
- run(Pipeline) - Method in class org.apache.beam.runners.prism.PrismRunner
- run(Pipeline) - Method in class org.apache.beam.runners.prism.TestPrismRunner
- run(Pipeline) - Method in class org.apache.beam.runners.spark.SparkRunner
- run(Pipeline) - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger
- run(Pipeline) - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
- run(Pipeline) - Method in class org.apache.beam.runners.spark.TestSparkRunner
- run(Pipeline) - Method in class org.apache.beam.runners.twister2.Twister2Runner
- run(Pipeline) - Method in class org.apache.beam.runners.twister2.Twister2TestRunner
- run(Pipeline) - Method in class org.apache.beam.sdk.PipelineRunner
-
Processes the given Pipeline, potentially asynchronously, returning a runner-specific type of result.
- run(Pipeline) - Method in class org.apache.beam.sdk.testing.CrashingRunner
- run(DoFn.OutputReceiver<PartitionRecord>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ResumeFromPreviousPipelineAction
-
Resume from a previously drained pipeline.
- run(DoFn.OutputReceiver<PartitionRecord>, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.GenerateInitialPartitionsAction
-
The very first step of the pipeline when there are no partitions being streamed yet.
- run(PTransform<PBegin, ?>) - Method in class org.apache.beam.sdk.PipelineRunner
-
Overloaded PTransform runner that runs with the default app PipelineOptions.
- run(PTransform<PBegin, ?>, PipelineOptions) - Method in class org.apache.beam.sdk.PipelineRunner
-
Creates a Pipeline out of a single PTransform step, and executes it.
- run(RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<PartitionMetadata>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DetectNewPartitionsAction
-
Executes the main logic to schedule new partitions.
- run(RestrictionTracker<OffsetRange, Long>, DoFn.OutputReceiver<PartitionRecord>, ManualWatermarkEstimator<Instant>, InitialPipelineState) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.DetectNewPartitionsAction
-
Perform the necessary steps to manage the initial set of partitions and new partitions.
- run(SourceFunction.SourceContext<WindowedValue<byte[]>>) - Method in class org.apache.beam.runners.flink.translation.functions.ImpulseSourceFunction
- run(SourceFunction.SourceContext<WindowedValue<byte[]>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.StreamingImpulseSource
-
Deprecated.
- run(SourceFunction.SourceContext<WindowedValue<ValueWithRecordId<OutputT>>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
- run(SourceFunction.SourceContext<WindowedValue<T>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.TestStreamSource
- runBeforeProcessing(PipelineOptions) - Static method in class org.apache.beam.sdk.fn.JvmInitializers
-
Finds all registered implementations of JvmInitializer and executes their beforeProcessing methods.
- RunInference<OutputT> - Class in org.apache.beam.sdk.extensions.python.transforms
-
Wrapper for invoking external Python RunInference.
- runInTransaction(Function<PartitionMetadataDao.InTransactionContext, T>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Runs a given function in a transaction context.
- runInTransaction(Function<PartitionMetadataDao.InTransactionContext, T>, String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
- Runner() - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Runner
- Runner() - Constructor for class org.apache.beam.runners.direct.DirectRegistrar.Runner
- Runner() - Constructor for class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Runner
- Runner() - Constructor for class org.apache.beam.runners.jet.JetRunnerRegistrar.Runner
- Runner() - Constructor for class org.apache.beam.runners.prism.PrismRegistrar.Runner
- Runner() - Constructor for class org.apache.beam.runners.spark.SparkRunnerRegistrar.Runner
- Runner() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunnerRegistrar.Runner
- Runner() - Constructor for class org.apache.beam.runners.twister2.Twister2RunnerRegistrar.Runner
- RunnerRegistrar() - Constructor for class org.apache.beam.runners.portability.testing.TestUniversalRunner.RunnerRegistrar
- Runner specific features - Search tag in class org.apache.beam.sdk.managed.Managed
- Section
- RUNNING - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
- RUNNING - Enum constant in enum class org.apache.beam.sdk.PipelineResult.State
-
The job is currently running.
- runOnStartup() - Static method in class org.apache.beam.sdk.fn.JvmInitializers
-
Finds all registered implementations of JvmInitializer and executes their onStartup methods.
- runQuery() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
-
Factory method to create a new type-safe builder for RunQueryRequest operations.
- runResourceIdBattery(ResourceId) - Static method in class org.apache.beam.sdk.io.fs.ResourceIdTester
- runTest(Pipeline) - Method in class org.apache.beam.runners.twister2.Twister2Runner
- runWindowFn(WindowFn<T, W>, List<Long>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
-
Runs the WindowFn over the provided input, returning a map of windows to the timestamps in those windows.
- runWindowFnWithValue(WindowFn<T, W>, List<TimestampedValue<T>>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
-
Runs the WindowFn over the provided input, returning a map of windows to the timestamps in those windows.
- runWithAdditionalOptionArgs(List<String>) - Method in class org.apache.beam.sdk.testing.TestPipeline
-
Runs this TestPipeline with additional command-line pipeline option arguments.
S
- S3ClientBuilderFactory - Interface in org.apache.beam.sdk.io.aws2.options
-
Construct S3ClientBuilder from S3 pipeline options.
- S3FileSystemConfiguration - Class in org.apache.beam.sdk.io.aws2.s3
-
Object used to configure S3FileSystem.
- S3FileSystemConfiguration() - Constructor for class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
- S3FileSystemConfiguration.Builder - Class in org.apache.beam.sdk.io.aws2.s3
- S3FileSystemRegistrar - Class in org.apache.beam.sdk.io.aws2.s3
-
AutoService registrar for the S3FileSystem.
- S3FileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.s3.S3FileSystemRegistrar
- S3FileSystemSchemeRegistrar - Interface in org.apache.beam.sdk.io.aws2.s3
-
A registrar that creates S3FileSystemConfiguration instances from PipelineOptions.
- S3Options - Interface in org.apache.beam.sdk.io.aws2.options
-
Options used to configure Amazon Web Services S3.
- S3Options.S3UploadBufferSizeBytesFactory - Class in org.apache.beam.sdk.io.aws2.options
-
Provides the default S3 upload buffer size in bytes: 64 MB if more than 512 MB of RAM is available, and 5 MB otherwise.
- S3Options.SSECustomerKeyFactory - Class in org.apache.beam.sdk.io.aws2.options
- S3UploadBufferSizeBytesFactory() - Constructor for class org.apache.beam.sdk.io.aws2.options.S3Options.S3UploadBufferSizeBytesFactory
- SADD - Enum constant in enum class org.apache.beam.sdk.io.redis.RedisIO.Write.Method
-
Use SADD command.
- SAME - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.EquivalenceNullablePolicy
- SAME_BIT_SIGNED - Enum constant in enum class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Behavior
-
Uses the signed primitive with the same bit count.
- Sample - Class in org.apache.beam.sdk.transforms
-
PTransforms for taking samples of the elements in a PCollection, or samples of the values associated with each key in a PCollection of KVs.
- Sample() - Constructor for class org.apache.beam.sdk.transforms.Sample
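As a quick illustration of the Sample entry above (a sketch; the input collection and sample size are arbitrary):
    import org.apache.beam.sdk.transforms.Sample;
    import org.apache.beam.sdk.values.PCollection;

    // Uniformly sample at most 10 elements from the input (input is hypothetical).
    PCollection<Iterable<String>> sampled = input.apply(Sample.fixedSizeGlobally(10));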
- SAMPLE_PARTITION - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
We use a bogus partition here to estimate the average size of a partition metadata record.
- Sample.FixedSizedSampleFn<T> - Class in org.apache.beam.sdk.transforms
-
CombineFn that computes a fixed-size sample of a collection of values.
- SampleAllFiles() - Constructor for class org.apache.beam.sdk.io.TextRowCountEstimator.SampleAllFiles
- satisfies(List<SerializableFunction<Iterable<T>, Void>>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionListContentsAssert
-
Takes a list of SerializableFunctions of the same size as PAssert.PCollectionListContentsAssert.pCollectionList, and applies each matcher to the PCollection with the identical index in the PAssert.PCollectionListContentsAssert.pCollectionList.
- satisfies(SerializableFunction<Iterable<T>, Void>) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Applies the provided checking function (presumably containing assertions) to the iterable in question.
- satisfies(SerializableFunction<Iterable<T>, Void>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
- satisfies(SerializableFunction<Iterable<T>, Void>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionListContentsAssert
- satisfies(SerializableFunction<T, Void>) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Applies the provided checking function (presumably containing assertions) to the value in question.
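A sketch of the satisfies(...) style of assertion from the entries above (the output collection and predicate are illustrative):
    import org.apache.beam.sdk.testing.PAssert;

    // Apply an arbitrary checking function to the PCollection's full contents.
    PAssert.that(output).satisfies(elements -> {
      for (String s : elements) {
        if (!s.startsWith("a")) {
          throw new AssertionError("unexpected element: " + s);
        }
      }
      return null; // the SerializableFunction must return Void
    });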
- satisfies(RelTrait) - Method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
- saveAsync(T) - Method in interface org.apache.beam.sdk.io.cassandra.Mapper
-
This method is called for each save event.
- SbeLogicalTypes - Class in org.apache.beam.sdk.extensions.sbe
-
Classes that represent various SBE semantic types.
- SbeLogicalTypes.LocalMktDate - Class in org.apache.beam.sdk.extensions.sbe
-
Representation of SBE's LocalMktDate.
- SbeLogicalTypes.TZTimeOnly - Class in org.apache.beam.sdk.extensions.sbe
-
Represents SBE's TimeOnly composite type.
- SbeLogicalTypes.TZTimestamp - Class in org.apache.beam.sdk.extensions.sbe
-
Represents SBE's TZTimestamp composite type.
- SbeLogicalTypes.Uint16 - Class in org.apache.beam.sdk.extensions.sbe
-
Represents SBE's uint16 type.
- SbeLogicalTypes.Uint32 - Class in org.apache.beam.sdk.extensions.sbe
-
Represents SBE's uint32 type.
- SbeLogicalTypes.Uint64 - Class in org.apache.beam.sdk.extensions.sbe
-
Represents SBE's uint64 type.
- SbeLogicalTypes.Uint8 - Class in org.apache.beam.sdk.extensions.sbe
-
Represents SBE's uint8 type.
- SbeLogicalTypes.UTCDateOnly - Class in org.apache.beam.sdk.extensions.sbe
-
Representation of SBE's UTCDateOnly.
- SbeLogicalTypes.UTCTimeOnly - Class in org.apache.beam.sdk.extensions.sbe
-
Represents SBE's UTCTimeOnly composite type.
- SbeLogicalTypes.UTCTimestamp - Class in org.apache.beam.sdk.extensions.sbe
-
Represents SBE's UTCTimestamp composite type.
- SbeSchema - Class in org.apache.beam.sdk.extensions.sbe
-
Represents an SBE schema.
- SbeSchema.IrOptions - Class in org.apache.beam.sdk.extensions.sbe
-
Options for configuring schema generation from an Ir.
- SbeSchema.IrOptions.Builder - Class in org.apache.beam.sdk.extensions.sbe
-
Builder for SbeSchema.IrOptions.
- ScalaInterop - Class in org.apache.beam.runners.spark.structuredstreaming.translation.utils
-
Utilities for easier interoperability with the Spark Scala API.
- ScalaInterop.Fun1<T, V> - Interface in org.apache.beam.runners.spark.structuredstreaming.translation.utils
- ScalaInterop.Fun2<T1, T2, V> - Interface in org.apache.beam.runners.spark.structuredstreaming.translation.utils
- scalaIterator(Iterable<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
-
Scala Iterator of Java Iterable.
- scalaIterator(Iterator<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
-
Scala Iterator of Java Iterator.
- SCALAR_FIELD_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- ScalarFn - Class in org.apache.beam.sdk.extensions.sql.udf
-
A scalar function that can be executed as part of a SQL query.
- ScalarFn() - Constructor for class org.apache.beam.sdk.extensions.sql.udf.ScalarFn
- ScalarFn.ApplyMethod - Annotation Interface in org.apache.beam.sdk.extensions.sql.udf
-
Annotates the single method in a ScalarFn implementation that is to be applied to SQL function arguments.
- ScalarFnReflector - Class in org.apache.beam.sdk.extensions.sql.impl
-
Reflection-based implementation logic for ScalarFn.
- ScalarFnReflector() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.ScalarFnReflector
- ScalarFunctionImpl - Class in org.apache.beam.sdk.extensions.sql.impl
-
Beam-customized version from ScalarFunctionImpl, to address BEAM-5921.
- ScalarFunctionImpl(Method, CallImplementor) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
- ScalarFunctionImpl(Method, CallImplementor, String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
- SCHEDULED - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
- ScheduledExecutorServiceFactory() - Constructor for class org.apache.beam.sdk.options.ExecutorOptions.ScheduledExecutorServiceFactory
- scheduleForCurrentProcessingTime(ProcessingTimeService.ProcessingTimeCallback) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- scheduleTask(Runnable, long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- scheduleTaskAtFixedRate(Runnable, long, long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- schema - Variable in class org.apache.beam.sdk.extensions.sql.meta.SchemaBaseBeamTable
- schema - Variable in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
- schema - Variable in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
- schema - Variable in class org.apache.beam.sdk.schemas.SchemaCoder
- schema() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
- schema() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- schema() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIO
-
Returns the schema of the data.
- schema(Schema) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
- Schema - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- Section
- Schema - Class in org.apache.beam.sdk.schemas
- Schema(List<Schema.Field>) - Constructor for class org.apache.beam.sdk.schemas.Schema
- Schema(List<Schema.Field>, Schema.Options) - Constructor for class org.apache.beam.sdk.schemas.Schema
- SCHEMA - Static variable in class org.apache.beam.sdk.io.ClassLoaderFileSystem
- SCHEMA - Static variable in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider
- Schema.Builder - Class in org.apache.beam.sdk.schemas
-
Builder class for building Schema objects.
- Schema.EquivalenceNullablePolicy - Enum Class in org.apache.beam.sdk.schemas
-
Control whether nullable is included in equivalence check.
- Schema.Field - Class in org.apache.beam.sdk.schemas
-
Field of a row.
- Schema.Field.Builder - Class in org.apache.beam.sdk.schemas
-
Builder for Schema.Field.
- Schema.FieldType - Class in org.apache.beam.sdk.schemas
-
A descriptor of a single field type.
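To make the Schema, Schema.Builder, and Row entries in this index concrete, a schema and a conforming row can be built like this (a minimal sketch; the field names and values are arbitrary):
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    // Build a two-field schema, then a row that conforms to it.
    Schema schema =
        Schema.builder().addStringField("name").addInt64Field("count").build();
    Row row = Row.withSchema(schema).addValues("example", 42L).build();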
- Schema.LogicalType<InputT, BaseT> - Interface in org.apache.beam.sdk.schemas
-
A LogicalType allows users to define a custom schema type.
- Schema.Options - Class in org.apache.beam.sdk.schemas
- Schema.Options.Builder - Class in org.apache.beam.sdk.schemas
- Schema.TypeName - Enum Class in org.apache.beam.sdk.schemas
-
An enumerated list of type constructors.
- SchemaAndRecord - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A wrapper for a GenericRecord and the TableSchema representing the schema of the table (or query) it was generated from.
- SchemaAndRecord(GenericRecord, TableSchema) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
- SchemaBaseBeamTable - Class in org.apache.beam.sdk.extensions.sql.meta
-
Each IO in Beam has one table schema, obtained by extending SchemaBaseBeamTable.
- SchemaBaseBeamTable(Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.SchemaBaseBeamTable
- SchemaCaseFormat - Annotation Interface in org.apache.beam.sdk.schemas.annotations
- schemaCoder(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Returns a SchemaCoder instance for the provided element class.
- schemaCoder(Class<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Returns a SchemaCoder instance for the provided element type using the provided Avro schema.
- schemaCoder(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Returns a SchemaCoder instance for the Avro schema.
- schemaCoder(AvroCoder<T>) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Returns a SchemaCoder instance based on the provided AvroCoder for the element type.
- schemaCoder(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Returns a SchemaCoder instance for the provided element type.
- SchemaCoder<T> - Class in org.apache.beam.sdk.schemas
-
SchemaCoder is used as the coder for types that have schemas registered.
- SchemaCoder(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Constructor for class org.apache.beam.sdk.schemas.SchemaCoder
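Connecting the schemaCoder(...) and SchemaCoder entries above, a PCollection can be given an Avro-derived schema coder roughly like this (a sketch; records and MyRecord are hypothetical):
    import org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils;

    // Assign a SchemaCoder derived from an Avro-compatible class.
    records.setCoder(AvroUtils.schemaCoder(MyRecord.class)); // MyRecord is hypothetical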
- SchemaCoderCloudObjectTranslator - Class in org.apache.beam.runners.dataflow.util
-
Translator for Schema coders.
- SchemaCoderCloudObjectTranslator() - Constructor for class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
- SchemaConversionOptions() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
- SchemaCreate - Annotation Interface in org.apache.beam.sdk.schemas.annotations
-
Can be put on a constructor or a static method, in which case that constructor or method will be used to create instances of the class by Beam's schema code.
- SchemaFieldDescription - Annotation Interface in org.apache.beam.sdk.schemas.annotations
- SchemaFieldName - Annotation Interface in org.apache.beam.sdk.schemas.annotations
- SchemaFieldNumber - Annotation Interface in org.apache.beam.sdk.schemas.annotations
- schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.AvroRecordSchema
- schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
- schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
- schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema
- schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
- schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema
- schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
- schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
- schemaFor(TypeDescriptor<T>) - Method in interface org.apache.beam.sdk.schemas.SchemaProvider
-
Lookup a schema for the given type.
- schemaFromClass(TypeDescriptor<?>, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference
-
Infer a schema from a Java class.
- schemaFromJavaBeanClass(TypeDescriptor<?>, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
Create a Schema for a Java Bean class.
- schemaFromPojoClass(TypeDescriptor<?>, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
- schemaFromProto(SchemaApi.Schema) - Static method in class org.apache.beam.sdk.schemas.SchemaTranslation
- SchemaIgnore - Annotation Interface in org.apache.beam.sdk.schemas.annotations
-
When used on a POJO field or a JavaBean getter, that field or getter is ignored from the inferred schema.
- SchemaInformationProvider - Interface in org.apache.beam.sdk.schemas.utils
-
Provides an instance of ConvertHelpers.ConvertedSchemaInformation.
- SchemaIO - Interface in org.apache.beam.sdk.schemas.io
-
An abstraction to create schema capable and aware IOs.
- SchemaIOProvider - Interface in org.apache.beam.sdk.schemas.io
-
Provider to create SchemaIO instances for use in Beam SQL and other SDKs.
- SchemaIOTableProviderWrapper - Class in org.apache.beam.sdk.extensions.sql.meta.provider
-
A general TableProvider for IOs for consumption by Beam SQL.
- SchemaIOTableProviderWrapper() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
- SchemaLogicalType - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A schema represented as serialized proto bytes.
- SchemaLogicalType() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
- schemaPathFromId(String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- schemaPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- SchemaProvider - Interface in org.apache.beam.sdk.schemas
-
Concrete implementations of this class allow creation of schema service objects that vend a Schema for a specific type.
- SchemaProviderRegistrar - Interface in org.apache.beam.sdk.schemas
-
SchemaProvider creators have the ability to automatically have their schemaProvider registered with this SDK by creating a ServiceLoader entry and a concrete implementation of this interface.
- SchemaRegistry - Class in org.apache.beam.sdk.schemas
- schemaToProto(Schema, boolean) - Static method in class org.apache.beam.sdk.schemas.SchemaTranslation
- schemaToProto(Schema, boolean, boolean) - Static method in class org.apache.beam.sdk.schemas.SchemaTranslation
- schemaToProtoTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- SchemaTransform - Class in org.apache.beam.sdk.schemas.transforms
-
An abstraction representing schema capable and aware transforms.
- SchemaTransform() - Constructor for class org.apache.beam.sdk.schemas.transforms.SchemaTransform
- SchemaTransformPayloadTranslator() - Constructor for class org.apache.beam.sdk.schemas.transforms.SchemaTransformTranslation.SchemaTransformPayloadTranslator
- SchemaTransformProvider - Interface in org.apache.beam.sdk.schemas.transforms
-
Provider to create SchemaTransform instances for use in Beam SQL and other SDKs.
- SchemaTransformTranslation - Class in org.apache.beam.sdk.schemas.transforms
-
A PTransformTranslation.TransformPayloadTranslator implementation that translates between a Java SchemaTransform and a protobuf payload for that transform.
- SchemaTransformTranslation() - Constructor for class org.apache.beam.sdk.schemas.transforms.SchemaTransformTranslation
- SchemaTransformTranslation.SchemaTransformPayloadTranslator<T> - Class in org.apache.beam.sdk.schemas.transforms
- SchemaTranslation - Class in org.apache.beam.sdk.schemas
-
Utility methods for translating schemas.
- SchemaTranslation() - Constructor for class org.apache.beam.sdk.schemas.SchemaTranslation
- schemaTypeCreator(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated. New implementations should override GetterBasedSchemaProvider.schemaTypeCreator(TypeDescriptor, Schema) and make this method throw an UnsupportedOperationException.
- schemaTypeCreator(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
- schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema
- schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.extensions.avro.schemas.AvroRecordSchema
- schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
- schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
- schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema
- schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated. Delegates to GetterBasedSchemaProvider.schemaTypeCreator(Class, Schema) for backwards compatibility; override it if you want to use the richer type signature contained in the TypeDescriptor, which is not subject to type erasure.
- schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
- schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
- schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
- SchemaUserTypeCreator - Interface in org.apache.beam.sdk.schemas
-
A creator interface for user types that have schemas.
- SchemaUtil - Class in org.apache.beam.sdk.io.jdbc
-
Provides utility functions for working with Beam Schema types.
- SchemaUtil() - Constructor for class org.apache.beam.sdk.io.jdbc.SchemaUtil
- SchemaUtil.BeamRowMapper - Class in org.apache.beam.sdk.io.jdbc
-
A JdbcIO.RowMapper implementation that converts JDBC results into Beam Row objects.
- SchemaUtils - Class in org.apache.beam.sdk.schemas
-
A set of utility functions for schemas.
- SchemaUtils() - Constructor for class org.apache.beam.sdk.schemas.SchemaUtils
- SchemaVerification - Class in org.apache.beam.sdk.values
- SchemaVerification() - Constructor for class org.apache.beam.sdk.values.SchemaVerification
- SchemaZipFold<T> - Class in org.apache.beam.sdk.schemas.utils
-
Visitor that zips schemas, and accepts pairs of fields and their types.
- SchemaZipFold() - Constructor for class org.apache.beam.sdk.schemas.utils.SchemaZipFold
- SchemaZipFold.Context - Class in org.apache.beam.sdk.schemas.utils
-
Context referring to a current position in a schema.
- SCHEME - Static variable in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- scopedMetricsContainer(MetricsContainer) - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
-
Set the MetricsContainer for the current thread.
- SDF_PREFIX - Static variable in class org.apache.beam.runners.fnexecution.control.BundleCheckpointHandlers.StateAndTimerBundleCheckpointHandler
- SdfFlinkKeyKeySelector<K, V> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
-
KeySelector that retrieves a key from a KV<KV<element, KV<restriction, watermarkState>>, size>.
- SdfFlinkKeyKeySelector(Coder<K>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.SdfFlinkKeyKeySelector
- SdkCoreByteStringOutputStream() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.SdkCoreByteStringOutputStream
- sdkFields() - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsBuilderFactory
- SdkHarnessClient - Class in org.apache.beam.runners.fnexecution.control
-
A high-level client for an SDK harness.
- SdkHarnessClient.BundleProcessor - Class in org.apache.beam.runners.fnexecution.control
-
A processor capable of creating bundles for some registered BeamFnApi.ProcessBundleDescriptor.
- SdkHarnessClient.BundleProcessor.ActiveBundle - Class in org.apache.beam.runners.fnexecution.control
-
An active bundle for a particular BeamFnApi.ProcessBundleDescriptor.
- SdkHarnessLogLevelOverrides() - Constructor for class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
- SdkHarnessOptions - Interface in org.apache.beam.sdk.options
-
Options that are used to control configuration of the SDK harness.
- SdkHarnessOptions.BundleProcessorCacheTimeoutFactory - Class in org.apache.beam.sdk.options
- SdkHarnessOptions.DefaultMaxCacheMemoryUsageMb - Class in org.apache.beam.sdk.options
-
The default implementation which detects how much memory to use for a process wide cache.
- SdkHarnessOptions.DefaultMaxCacheMemoryUsageMbFactory - Class in org.apache.beam.sdk.options
-
A DefaultValueFactory which constructs an instance of the class specified by maxCacheMemoryUsageMbClass to compute the maximum amount of memory to allocate to the process wide cache within an SDK harness instance.
- SdkHarnessOptions.LogLevel - Enum Class in org.apache.beam.sdk.options
-
The set of log levels that can be used in the SDK harness.
- SdkHarnessOptions.MaxCacheMemoryUsageMb - Interface in org.apache.beam.sdk.options
-
Specifies the maximum amount of memory to use within the current SDK harness instance.
- SdkHarnessOptions.SdkHarnessLogLevelOverrides - Class in org.apache.beam.sdk.options
-
Defines a log level override for a specific class, package, or name.
- SEARCH - Enum constant in enum class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod
- searchFhirResource(String, String, Map<String, Object>, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Search FHIR resources, returning the HTTP body.
- searchFhirResource(String, String, Map<String, Object>, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- searchResources(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Search resources from a FHIR store with String parameter values.
- searchResourcesWithGenericParameters(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Search resources from a FHIR store with any type of parameter values.
- secretManagerProjectId() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
- secretManagerProjectId(String) - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
-
Optional for Dataflow or VMs running on Google Cloud.
- Seekability - Search tag (section) in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
- seekable(BeamRelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
-
Check if a BeamRelNode implements BeamSeekableTable.
- seekableInputIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
- seekRow(Row) - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlSeekableTable
-
Return a list of Row with the given key set.
- select(Row) - Method in interface org.apache.beam.sdk.schemas.utils.RowSelector
- select(Row) - Method in class org.apache.beam.sdk.schemas.utils.SelectHelpers.RowSelectorContainer
- Select - Class in org.apache.beam.sdk.schemas.transforms
-
A PTransform for selecting a subset of fields from a schema type.
- Select() - Constructor for class org.apache.beam.sdk.schemas.transforms.Select
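As a quick illustration of the Select transform listed above, a hedged sketch; the pipeline, schema, field names, and values are all invented for the example:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.schemas.transforms.Select;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

public class SelectExample {
  public static void main(String[] args) {
    Schema schema =
        Schema.builder()
            .addStringField("userId")
            .addDoubleField("cost")
            .addStringField("item")
            .build();
    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
    PCollection<Row> rows =
        p.apply(
            Create.of(Row.withSchema(schema).addValues("alice", 3.50, "tea").build())
                .withRowSchema(schema));
    // Keep only userId and cost; the result is a PCollection<Row> whose
    // schema contains just those two fields.
    PCollection<Row> slim = rows.apply(Select.fieldNames("userId", "cost"));
    p.run().waitUntilFinish();
  }
}
```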
- Select.Fields<T> - Class in org.apache.beam.sdk.schemas.transforms
- Select.Flattened<T> - Class in org.apache.beam.sdk.schemas.transforms
-
A PTransform representing a flattened schema.
- selectedFieldsProvider - Variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- SelectHelpers - Class in org.apache.beam.sdk.schemas.utils
-
Helper methods to select subrows out of rows.
- SelectHelpers() - Constructor for class org.apache.beam.sdk.schemas.utils.SelectHelpers
- SelectHelpers.RowSelectorContainer - Class in org.apache.beam.sdk.schemas.utils
- Semp - Class in org.apache.beam.sdk.io.solace.data
- Semp() - Constructor for class org.apache.beam.sdk.io.solace.data.Semp
- Semp.Queue - Class in org.apache.beam.sdk.io.solace.data
- Semp.QueueData - Class in org.apache.beam.sdk.io.solace.data
- SempBasicAuthClientExecutor - Class in org.apache.beam.sdk.io.solace.broker
-
A class to execute requests to SEMP v2 with Basic Auth authentication.
- SempBasicAuthClientExecutor(String, String, String, String, HttpRequestFactory) - Constructor for class org.apache.beam.sdk.io.solace.broker.SempBasicAuthClientExecutor
- SempClient - Interface in org.apache.beam.sdk.io.solace.broker
-
This interface defines methods for interacting with a Solace message broker using the Solace Element Management Protocol (SEMP).
- SempClientFactory - Interface in org.apache.beam.sdk.io.solace.broker
-
This interface serves as a blueprint for creating SempClient objects, which are used to interact with a Solace message broker using the Solace Element Management Protocol (SEMP).
- sendElements(BeamFnApi.Elements) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
- SENDER_TIMESTAMP_FUNCTION - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
- Sending messages to an AMQP endpoint - Search tag (section) in class org.apache.beam.sdk.io.amqp.AmqpIO
- sendOrCollectBufferedDataAndFinishOutboundStreams() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
-
Closes the streams for all registered outbound endpoints.
- seqOf(T...) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
- sequence_id_outside_valid_range - Enum constant in enum class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent.Reason
- SequenceDefinition() - Constructor for class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
- SequenceDefinition(Instant, Instant, Duration) - Constructor for class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
- SequenceDefinition(Instant, Instant, Duration, boolean) - Constructor for class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
-
catchUpToNow is experimental; no backwards-compatibility guarantees.
- SequenceRangeAccumulator - Class in org.apache.beam.sdk.extensions.ordered.combiner
-
Default accumulator used to combine sequence ranges.
- SequenceRangeAccumulator() - Constructor for class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
- SequenceRangeAccumulator.SequenceRangeAccumulatorCoder - Class in org.apache.beam.sdk.extensions.ordered.combiner
- serde(T) - Static method in class org.apache.beam.runners.jet.Utils
-
Returns a deep clone of an object by serializing and deserializing it (ser-de).
- SerdeUtils - Class in org.apache.beam.runners.flink.translation.utils
-
Util methods to help with serialization / deserialization.
- Serializability of DoFns - Search tag (section) in class org.apache.beam.sdk.transforms.ParDo
- SerializableBiConsumer<FirstInputT, SecondInputT> - Interface in org.apache.beam.sdk.transforms
-
A union of the BiConsumer and Serializable interfaces.
- SerializableBiFunction<FirstInputT, SecondInputT, OutputT> - Interface in org.apache.beam.sdk.transforms
-
A union of the BiFunction and Serializable interfaces.
- SerializableCoder<T> - Class in org.apache.beam.sdk.coders
-
A Coder for Java classes that implement Serializable.
- SerializableCoder(Class<T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.SerializableCoder
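A brief sketch of the SerializableCoder entry above; the Order class is a hypothetical Serializable POJO:

```java
import java.io.Serializable;
import org.apache.beam.sdk.coders.SerializableCoder;

public class SerializableCoderExample {
  // Any Serializable class works; this POJO is invented for the example.
  static class Order implements Serializable {
    String id;
    double total;
  }

  public static void main(String[] args) {
    // Encodes via standard Java serialization: convenient, but less
    // compact and update-friendly than a hand-written coder.
    SerializableCoder<Order> coder = SerializableCoder.of(Order.class);
    System.out.println(coder);
  }
}
```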
- SerializableCoder.SerializableCoderProviderRegistrar - Class in org.apache.beam.sdk.coders
-
A CoderProviderRegistrar which registers a CoderProvider which can handle serializable types.
- SerializableCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.coders.SerializableCoder.SerializableCoderProviderRegistrar
- SerializableComparator<T> - Interface in org.apache.beam.sdk.transforms
-
A Comparator that is also Serializable.
- SerializableConfiguration - Class in org.apache.beam.sdk.io.hadoop
-
A wrapper to allow Hadoop Configurations to be serialized using Java's standard serialization mechanisms.
- SerializableConfiguration() - Constructor for class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
- SerializableConfiguration(Configuration) - Constructor for class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
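The SerializableConfiguration wrapper above round-trips a Hadoop Configuration across Java serialization. A small sketch, assuming the get() accessor for unwrapping; the fs.defaultFS value is illustrative only:

```java
import org.apache.beam.sdk.io.hadoop.SerializableConfiguration;
import org.apache.hadoop.conf.Configuration;

public class SerializableConfigurationExample {
  public static void main(String[] args) {
    Configuration conf = new Configuration(false);
    conf.set("fs.defaultFS", "hdfs://namenode:8020"); // hypothetical setting
    // Wrap it so it can cross a DoFn's serialization boundary.
    SerializableConfiguration wrapped = new SerializableConfiguration(conf);
    // Unwrap on the other side.
    Configuration restored = wrapped.get();
    System.out.println(restored.get("fs.defaultFS"));
  }
}
```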
- SerializableFunction<InputT, OutputT> - Interface in org.apache.beam.sdk.transforms
-
A function that computes an output value of type OutputT from an input value of type InputT, is Serializable, and does not allow checked exceptions to be declared.
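Because SerializableFunction is a functional interface, it is usually supplied as a lambda; a minimal sketch:

```java
import org.apache.beam.sdk.transforms.SerializableFunction;

public class SerializableFunctionExample {
  public static void main(String[] args) {
    // apply() declares no checked exceptions, so any lambda here must
    // handle its own failures; the instance itself is Serializable.
    SerializableFunction<String, Integer> length = s -> s.length();
    System.out.println(length.apply("beam")); // prints 4
  }
}
```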
- SerializableFunctions - Class in org.apache.beam.sdk.transforms
-
Useful SerializableFunction overrides.
- SerializableFunctions() - Constructor for class org.apache.beam.sdk.transforms.SerializableFunctions
- SerializableIr - Class in org.apache.beam.sdk.extensions.sbe
-
A wrapper around Ir that fulfils Java's Serializable contract.
- SerializableMatcher<T> - Interface in org.apache.beam.sdk.testing
-
A Matcher that is also Serializable.
- SerializableMatchers - Class in org.apache.beam.sdk.testing
-
Static class for building and using SerializableMatcher instances.
- serializablePipelineOptions - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- SerializableRexFieldAccess - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
SerializableRexFieldAccess.
- SerializableRexFieldAccess(RexFieldAccess) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexFieldAccess
- SerializableRexInputRef - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
SerializableRexInputRef.
- SerializableRexInputRef(RexInputRef) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexInputRef
- SerializableRexNode - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
SerializableRexNode.
- SerializableRexNode() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexNode
- SerializableRexNode.Builder - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
SerializableRexNode.Builder.
- SerializableSplit() - Constructor for class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.SerializableSplit
- SerializableSplit(InputSplit) - Constructor for class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.SerializableSplit
- Serialization Of PipelineOptions - Search tag (section) in interface org.apache.beam.sdk.options.PipelineOptions
- serialize(byte[], DataOutputView) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- serialize(String, Instant) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
- serialize(StateNamespace, DataOutputView) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- serialize(ValueProvider<?>, JsonGenerator, SerializerProvider) - Method in class org.apache.beam.sdk.options.ValueProvider.Serializer
- serialize(Row) - Method in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializer
- serialize(T, DataOutputView) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- serializeAwsCredentialsProvider(AwsCredentialsProvider) - Static method in class org.apache.beam.sdk.io.aws2.options.AwsSerializableUtils
- SERIALIZED_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- SERIALIZED_TEST_STREAM - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- serializedOptions - Variable in class org.apache.beam.runners.flink.translation.functions.FlinkPartialReduceFunction
- serializedOptions - Variable in class org.apache.beam.runners.flink.translation.functions.FlinkReduceFunction
- serializedOptions - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- serializeObject(Object) - Static method in class org.apache.beam.runners.flink.translation.utils.SerdeUtils
- serializeOneOf(Expression, List<Encoder<T>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
- serializer() - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplit
- Serializer() - Constructor for class org.apache.beam.sdk.options.ValueProvider.Serializer
- serializeTimers(Collection<TimerInternals.TimerData>, TimerInternals.TimerDataCoderV2) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- serializeToCloudSource(Source<?>, PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.internal.CustomSources
- serialVersionUID - Static variable in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
- serialVersionUID - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- serialVersionUID - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
- serialVersionUID - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
- serialVersionUID - Static variable in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializer
- seriesId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- ServerConfiguration() - Constructor for class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- serverDirect() - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
-
Like OutboundObserverFactory.clientDirect() but for server-side RPCs.
- ServerFactory - Class in org.apache.beam.sdk.fn.server
-
A gRPC server factory.
- ServerFactory() - Constructor for class org.apache.beam.sdk.fn.server.ServerFactory
- ServerFactory.InetSocketAddressServerFactory - Class in org.apache.beam.sdk.fn.server
-
Creates a gRPC Server using the default server factory.
- ServerFactory.UrlFactory - Interface in org.apache.beam.sdk.fn.server
-
Factory that constructs client-accessible URLs from a local server address and port.
- ServerInfo() - Constructor for class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.ServerInfo
- SESSION_WINDOW_TVF - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.TVFStreamingUtils
- sessionBuilder(String) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.SparkSessionFactory
-
Creates Spark session builder with some optimizations for local mode, e.g.
- sessionDurationSecs() - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider
- Sessions - Class in org.apache.beam.sdk.transforms.windowing
-
A WindowFn that windows values into sessions separated by periods with no input for at least the duration specified by Sessions.getGapDuration().
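A short sketch of the Sessions WindowFn above; the clicks collection and the ten-minute gap are illustrative:

```java
import org.apache.beam.sdk.transforms.windowing.Sessions;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

public class SessionWindows {
  // Groups each user's events into sessions that close after
  // ten minutes without input (the gap duration).
  static PCollection<KV<String, Long>> sessionize(PCollection<KV<String, Long>> clicks) {
    return clicks.apply(
        Window.into(Sessions.withGapDuration(Duration.standardMinutes(10))));
  }
}
```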
- SessionService - Class in org.apache.beam.sdk.io.solace.broker
-
The SessionService interface provides a set of methods for managing a session with the Solace messaging system.
- SessionService() - Constructor for class org.apache.beam.sdk.io.solace.broker.SessionService
- SessionServiceFactory - Class in org.apache.beam.sdk.io.solace.broker
-
This abstract class serves as a blueprint for creating `SessionServiceFactory` objects.
- SessionServiceFactory() - Constructor for class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
- set() - Static method in class org.apache.beam.sdk.state.StateSpecs
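The StateSpecs.set() entry above (and its coder-taking overload listed further down) declares a per-key SetState; a hedged sketch with an invented DoFn and state id:

```java
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.state.SetState;
import org.apache.beam.sdk.state.StateSpec;
import org.apache.beam.sdk.state.StateSpecs;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.values.KV;

class DistinctTagsFn extends DoFn<KV<String, String>, String> {
  // Per-key set state; the coder-taking overload avoids relying on
  // coder inference for the element type.
  @StateId("tags")
  private final StateSpec<SetState<String>> tagsSpec = StateSpecs.set(StringUtf8Coder.of());

  @ProcessElement
  public void process(@Element KV<String, String> e, @StateId("tags") SetState<String> tags) {
    tags.add(e.getValue());
  }
}
```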
- set(long) - Method in class org.apache.beam.runners.jet.metrics.GaugeImpl
- set(long) - Method in class org.apache.beam.sdk.metrics.DelegatingGauge
-
Set the gauge.
- set(long) - Method in interface org.apache.beam.sdk.metrics.Gauge
-
Set current value for this gauge.
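A small sketch of the Gauge.set usage described above, with an invented DoFn and metric name:

```java
import org.apache.beam.sdk.metrics.Gauge;
import org.apache.beam.sdk.metrics.Metrics;
import org.apache.beam.sdk.transforms.DoFn;

class BacklogFn extends DoFn<Long, Long> {
  // A gauge records the latest value reported; useful for level-style
  // measurements such as queue depth or backlog size.
  private final Gauge backlog = Metrics.gauge(BacklogFn.class, "backlog");

  @ProcessElement
  public void process(@Element Long size, OutputReceiver<Long> out) {
    backlog.set(size);
    out.output(size);
  }
}
```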
- set(long...) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
-
Creates a Set of elements to be used as expected output in WindowFnTestUtils.runWindowFn(org.apache.beam.sdk.transforms.windowing.WindowFn<T, W>, java.util.List<java.lang.Long>).
- set(String, Instant) - Method in interface org.apache.beam.sdk.state.TimerMap
- set(ObjectT, ValueT) - Method in interface org.apache.beam.sdk.schemas.FieldValueSetter
-
Sets the specified field on object to value.
- set(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Identical to StateSpecs.set(), but with an element coder explicitly supplied.
- set(Instant) - Method in interface org.apache.beam.sdk.state.Timer
-
Sets or resets the time in the timer's TimeDomain at which it should fire.
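The Timer.set entry above is easiest to see in a stateful DoFn; a minimal, hypothetical sketch (the timer id "expiry" and class name are invented):

```java
import org.apache.beam.sdk.state.TimeDomain;
import org.apache.beam.sdk.state.Timer;
import org.apache.beam.sdk.state.TimerSpec;
import org.apache.beam.sdk.state.TimerSpecs;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.values.KV;
import org.joda.time.Duration;

class TimerFn extends DoFn<KV<String, String>, String> {
  @TimerId("expiry")
  private final TimerSpec expirySpec = TimerSpecs.timer(TimeDomain.EVENT_TIME);

  @ProcessElement
  public void process(ProcessContext c, @TimerId("expiry") Timer timer) {
    // Fire one minute past the current element's timestamp; calling
    // set() again later resets the firing time.
    timer.set(c.timestamp().plus(Duration.standardMinutes(1)));
  }

  @OnTimer("expiry")
  public void onExpiry(OnTimerContext c) {
    c.output("timer fired");
  }
}
```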
- SET - Enum constant in enum class org.apache.beam.sdk.io.redis.RedisIO.Write.Method
-
Use the SET command.
- setAccessKey(String) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- setAccountName(String) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- setActiveWorkRefreshPeriodMillis(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setAggregationEnabled(Boolean) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setAggregationEnabled(Boolean) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
- setAggregationMaxBufferedTime(Long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setAggregationMaxBufferedTime(Long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
- setAggregationMaxBytes(Long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setAggregationMaxBytes(Long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
- setAggregationShardRefreshInterval(Long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setAggregationShardRefreshInterval(Long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
- setAllowDuplicates(Boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setAllowNonRestoredState(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setAlsoStartLoopbackWorker(boolean) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- setApiRootUrl(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setAppend(Boolean) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
- setAppName(String) - Method in interface org.apache.beam.sdk.options.ApplicationNameOptions
- setArtifactPort(int) - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- setAssumedRoleArn(String) - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider.Builder
-
Sets the role to be assumed by the authentication request.
- setAttachedMode(boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setAttachmentBytes(byte[]) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- setAttributeId(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- setAttributeId(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
- setAttributeMap(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- setAttributes(List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- setAttributes(List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
- setAttributes(List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- setAttributes(List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
- setAttributesMap(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- setAttributesMap(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
- setAudience(String) - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider.Builder
-
Sets the audience to be used for the web id token request.
- setAuthenticator(String) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setAutoBalanceWriteFilesShardingEnabled(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setAutoCommit(boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- setAutoOffsetResetConfig(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setAutoscalingAlgorithm(DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
- setAutosharding(Boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- setAutoSharding(Boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setAutoWatermarkInterval(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setAveragePartitionBytesSize(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
-
Sets the average partition bytes size to estimate the backlog of this DoFn.
- setAwsAccessKey(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setAwsAccessKey(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
- setAwsCredentialsProvider(AwsCredentialsProvider) - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
- setAwsRegion(Region) - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
- setAwsSecretKey(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setAwsSecretKey(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
- setAzureConnectionString(String) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- setAzureCredentialsProvider(TokenCredential) - Method in interface org.apache.beam.sdk.io.azure.options.AzureOptions
- setBatching(Boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setBatching(Boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
- setBatchIntervalMillis(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- setBatchRequestSupplier(Supplier<GcsUtil.BatchInterface>) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
- setBatchSize(Integer) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration.Builder
- setBatchSize(Long) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- setBatchSizeBytes(Integer) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
- setBatchSizeBytes(Integer) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
- setBatchSizeBytes(Integer) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
- setBeamVersion(String) - Method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
-
Specifies the Beam version to get containers for the transform service.
- setBigQueryEndpoint(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setBigQueryLocation(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
- setBigQueryProject(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setBigQueryServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform
- setBigQueryServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform
- setBigtableChangeStreamInstanceId(String) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.BigtableChangeStreamTestOptions
- setBlobServiceEndpoint(String) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- setBlobstoreClientFactoryClass(Class<? extends BlobstoreClientBuilderFactory>) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- setBlockOnRun(boolean) - Method in interface org.apache.beam.runners.direct.DirectOptions
- setBlockOnRun(boolean) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
- setBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
-
Sets the bootstrap servers for the Kafka consumer.
- setBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setBqStreamingApiLoggingFrequencySec(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setBranch(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setBucketKeyEnabled(boolean) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- setBucketKeyEnabled(boolean) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
- setBundleFinishedCallback(Runnable) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- setBundleProcessorCacheTimeout(Duration) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
- setBundleSize(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- setCacheDisabled(boolean) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- setCalciteConnectionProperties(Map<String, String>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
- setCallable(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration.Builder
- setCaseSensitive(Boolean) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setCatalog(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- setCatalogConfig(IcebergCatalogConfig) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setCatalogName(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.Builder
- setCatalogName(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
- setCatalogProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.Builder
- setCatalogProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
- setChangeStreamName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
- setChannelzShowOnlyWindmillServiceChannels(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setCharset(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration.Builder
-
The charset used to write the file.
- setCheckpoint(Long) - Method in interface org.apache.beam.sdk.io.sparkreceiver.HasOffset
-
Some Receivers support a checkpoint mechanism (e.g.
- setCheckpointDir(String) - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- setCheckpointDurationMillis(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- setCheckpointingInterval(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setCheckpointingInterval(Long) - Method in interface org.apache.beam.sdk.io.kafka.KafkaIO.Read.FakeFlinkPipelineOptions
- setCheckpointingMode(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setCheckpointTimeoutMillis(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setChecksum(String) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
- setClientBuilderFactory(Class<? extends ClientBuilderFactory>) - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
- setClientFactory(PubsubTestClient.PubsubTestClientFactory) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- setClientInfo(String, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- setClientInfo(Properties) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- setClock(Clock) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- setClock(PubsubIO.Read<T>, Clock) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient.PubsubTestClientFactory
- setCloningBehavior(DoFnTester.CloningBehavior) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- setClusteringFields(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setClusterName(String) - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- setClusterType(String) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- setCodeJarPathname(String) - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- setCoder(String) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
- setCoder(Coder<T>) - Method in class org.apache.beam.sdk.values.PCollection
-
Sets the Coder used by this PCollection to encode and decode the values stored in it.
- SetCoder<T> - Class in org.apache.beam.sdk.coders
- SetCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.SetCoder
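The adjacent PCollection.setCoder and SetCoder entries combine naturally; a hedged sketch for a hypothetical PCollection of string sets whose coder could not be inferred:

```java
import java.util.Set;
import org.apache.beam.sdk.coders.SetCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.values.PCollection;

public class CoderExample {
  // Explicitly sets the coder when inference cannot determine one.
  static PCollection<Set<String>> withCoder(PCollection<Set<String>> tagSets) {
    return tagSets.setCoder(SetCoder.of(StringUtf8Coder.of()));
  }
}
```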
- setCoderRegistry(CoderRegistry) - Method in class org.apache.beam.sdk.Pipeline
-
Deprecated. This should never be used - every Pipeline has a registry throughout its lifetime.
- setColumnDelimiter(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
- setColumnDelimiter(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
- setColumnDelimiter(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
- setColumns(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
- setColumns(SnowflakeColumn[]) - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
- setCommitOffsetInFinalize(Boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setCompression(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
-
The file Compression. See FileWriteSchemaTransformConfiguration.getCompression() for more details.
- setCompression(String) - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration.Builder
- setCompression(String) - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration.Builder
- setCompression(Compression) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
- setCompressionCodecName(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration.Builder
-
Specifies the compression codec.
- setConfig(byte[]) - Method in class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar.Configuration
- setConfigProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.Builder
- setConfigProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
- setConfluentSchemaRegistrySubject(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setConfluentSchemaRegistryUrl(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setConnectionInitSql(List<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- setConnectionInitSql(List<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setConnectionProperties(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setConnectionProperties(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- setConnectionProperties(List<String>) - Method in class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder.Configuration
- setConnectorClass(String) - Method in class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder.Configuration
- setConsumerConfig(Map<String, String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setConsumerConfigUpdates(Map<String, String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setConsumerPollingTimeout(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setContentType(String) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions.Builder
- setContext(BatchContextImpl) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
- setCosmosClientBuilder(CosmosClientBuilder) - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
- setCosmosKey(String) - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
- setCosmosServiceEndpoint(String) - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
- setCountBackoffs(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountCacheReadFailures(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountCacheReadNonNulls(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountCacheReadNulls(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountCacheReadRequests(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountCacheWriteFailures(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountCacheWriteRequests(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountCacheWriteSuccesses(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountCalls(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountFailures(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountRequests(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountResponses(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountryOfResidence(String) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
- setCountSetup(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountShouldBackoff(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountSleeps(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCountTeardown(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
- setCreatedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the time at which the partition was created.
- setCreateDisposition(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setCreateDisposition(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
- setCreateFromSnapshot(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- setCredentialFactoryClass(Class<? extends CredentialFactory>) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
- setCrossProduct(Boolean) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration.Builder
- setCsvConfiguration(FileWriteSchemaTransformConfiguration.CsvConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
-
Configures extra details related to writing CSV formatted files.
- setCurrentBundleTimestamp(Instant) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- setCurrentContainer(MetricsContainer) - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
-
Set the MetricsContainer for the current thread.
- setCurrentKey(Object) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
-
We don't want to set anything here.
- setCurrentSchema(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Set the current (default) schema.
- setCurrentTransform(AppliedPTransform<?, ?, ?>) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- setCurrentTransform(AppliedPTransform<?, ?, ?>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- setCustomBeamRequirement(String) - Method in interface org.apache.beam.sdk.extensions.python.PythonExternalTransformOptions
-
Set a custom Beam version for bootstrapping the Beam venv.
- setCustomerId(int) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
- setCustomerProvidedKey(CustomerProvidedKey) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- setCustomErrors(CustomHttpErrors) - Method in class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
- setDatabase(String) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
- setDatabase(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setDatabase(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
- setDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
- setDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
- setDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
- setDataCatalogEndpoint(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptions
- setDataflowClient(Dataflow) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setDataflowEndpoint(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setDataflowEndpoint(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- setDataflowJobFile(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setDataflowKmsKey(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
- setDataflowServiceOptions(List<String>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- setDataflowWorkerJar(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- setDataSchema(byte[]) - Method in class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar.Configuration
- setDataSourceConfiguration(SingleStoreIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
- setDataSourceConfiguration(SingleStoreIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration.Builder
- setDataType(SnowflakeDataType) - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
- setDdl(String) - Method in class org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar.Configuration
- setDebeziumConnectionProperties(List<String>) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
- setDeduplicate(Deduplicate.KeyedValues<Uuid, SequencedMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
-
Set the deduplication transform.
- setDefaultEnvironmentConfig(String) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- setDefaultEnvironmentType(String) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- setDefaultPipelineOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Sets the default configuration in workers.
- setDefaultPipelineOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Initialize metrics flags if not already done.
- setDefaultSdkHarnessLogLevel(SdkHarnessOptions.LogLevel) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
- setDefaultTimezone(String) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
- setDefaultWorkerLogLevel(DataflowWorkerLoggingOptions.Level) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated.
- setDeidentifyConfig(DeidentifyConfig) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
- setDeidentifyTemplateName(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
- setDeleteCheckpointDir(boolean) - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
- setDelimiter(String) - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration.Builder
- setDelimiters(byte[]) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
- setDescription(String) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
- setDescription(String) - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
- setDescription(String) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
- setDesiredNumUnboundedSourceSplits(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setDestination(Solace.Destination) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- setDialect(String) - Method in class org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar.Configuration
- setDirectoryTreatment(FileIO.ReadMatches.DirectoryTreatment) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
- setDisableAutoCommit(Boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setDisableMetrics(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setDiskSizeGb(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
- setDisplayData(List<DisplayData.ItemSpec<?>>) - Method in class org.apache.beam.sdk.transforms.PTransform
-
Set display data for your PTransform.
- setDriverClassName(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setDriverClassName(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- setDriverJars(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setDriverJars(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- setDrop(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setDrop(List<String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
- setDrop(List<String>) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
- setDropFields(List<String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setDumpHeapOnOOM(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setDuplicateCount(long) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
- setDynamicReadPollIntervalSeconds(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setEarliestBufferedSequence(Long) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
- setElasticsearchHttpPort(Integer) - Method in interface org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon.ElasticsearchPipelineOptions
- setElasticsearchServer(String) - Method in interface org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon.ElasticsearchPipelineOptions
- setElements(Long) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate.Builder
- setElementsPerPeriod(Long) - Method in class org.apache.beam.sdk.io.GenerateSequence.External.ExternalConfiguration
- setElementType(FieldValueTypeInformation) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
- setEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
- setEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
- setEmulatorHost(String) - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
Define a host port pair to allow connecting to a Cloud Firestore emulator instead of the live service.
- setEnableBucketReadMetricCounter(Boolean) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setEnableBucketWriteMetricCounter(Boolean) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setEnableHeapDumps(boolean) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- setEnableLogViaFnApi(boolean) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
- setEnableSparkMetricSinks(Boolean) - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- setEnableStableInputDrain(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setEnableStorageReadApiV2(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setEnableStreamingEngine(boolean) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
- setEnableWebUI(Boolean) - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- setEncodedRecord(byte[]) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
- setEncodingPositions(Map<String, Integer>) - Method in class org.apache.beam.sdk.schemas.Schema
-
Sets the encoding positions for this schema.
- setEnd(Long) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder
- setEndAtTimestamp(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
- setEndpoint(URI) - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
- setEndTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- setEndTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the end time of the partition.
- setEnforceEncodability(boolean) - Method in interface org.apache.beam.runners.direct.DirectOptions
- setEnforceImmutability(boolean) - Method in interface org.apache.beam.runners.direct.DirectOptions
- setEnvironmentCacheMillis(int) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- setEnvironmentExpirationMillis(int) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- setEnvironmentOptions(List<String>) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- setError(String) - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult.Builder
- setError(String) - Method in class org.apache.beam.sdk.schemas.io.Failure.Builder
- setErrorField(String) - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
-
Adds the error message to the returned error Row.
- setErrorHandling(BigQueryWriteConfiguration.ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setErrorHandling(PubsubReadSchemaTransformConfiguration.ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- setErrorHandling(PubsubWriteSchemaTransformConfiguration.ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
- setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
- setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
- setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
- setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration.Builder
- setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration.Builder
- setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration.Builder
- setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
- setException(String) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
- setExceptionStacktrace(String) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
- setExecutionModeForBatch(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setExecutionRetryDelay(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setExecutorService(ExecutorService) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
Deprecated. Use ExecutorOptions.setScheduledExecutorService(java.util.concurrent.ScheduledExecutorService) instead. If set, it may result in multiple ExecutorServices, and therefore thread pools, in the runtime.
- setExpansionPort(int) - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- setExpansionServiceConfig(ExpansionServiceConfig) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- setExpansionServiceConfigFile(String) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- setExpectedAssertions(Integer) - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
- setExpectFileToNotExist(boolean) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions.Builder
- setExpectFileToNotExist(Boolean) - Method in class org.apache.beam.sdk.io.fs.CreateOptions.Builder
- setExperiments(List<String>) - Method in interface org.apache.beam.sdk.options.ExperimentalOptions
- setExpiration(long) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- setExpression(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration.Builder
- setExternalizedCheckpointsEnabled(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setExtraInteger(Integer) - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config.Builder
- setExtraString(String) - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config.Builder
- setFailOnCheckpointingErrors(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setFailsafeTableRowPayload(byte[]) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
- setFailToLock(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- setFailure(BadRecord.Failure) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Builder
- setFasterCopy(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setFeatures(AnnotateTextRequest.Features) - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText.Builder
- setFetchSize(Integer) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setField(Field) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
- setFieldId(Integer) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Builder
- setFieldName(String) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Builder
- setFieldRename(String) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Builder
- setFields(List<String>) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration.Builder
- setFields(Map<String, JavaRowUdf.Configuration>) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
- setFileDescriptorPath(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- setFileDescriptorPath(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
- setFileDescriptorPath(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setFileDescriptorPath(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setFileFormat(FileFormat) - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination.Builder
- setFileInputSplitMaxSizeMB(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setFilenamePrefix(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
-
A common prefix to use for all generated filenames.
- setFilenameSuffix(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
-
Configures the filename suffix for written files.
- setFilenameSuffix(String) - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration.Builder
- setFilepattern(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
- setFilePattern(String) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
- setFilePattern(String) - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration.Builder
- setFilesToStage(List<String>) - Method in interface org.apache.beam.sdk.options.FileStagingOptions
- setFileSystem(FileSystem) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- setFilterString(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setFinishBundleBeforeCheckpointing(boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setFinishedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the time at which the partition finished running.
- setFirestoreDb(String) - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
Set the Firestore database ID to connect to.
- setFirestoreHost(String) - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
Define a host:port pair for connecting to a Cloud Firestore instance other than the default live service.
- setFirestoreProject(String) - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
Set the Firestore project ID; it will override the value from GcpOptions.getProject().
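For example, a minimal sketch wiring these Firestore options into a pipeline (the project ID and host below are illustrative placeholders, not values from this index):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class FirestoreOptionsSketch {
      public static void main(String[] args) {
        FirestoreOptions options = PipelineOptionsFactory.fromArgs(args).as(FirestoreOptions.class);
        options.setFirestoreProject("my-project");   // illustrative; overrides GcpOptions.getProject()
        options.setFirestoreHost("localhost:8080");  // illustrative host:port (e.g. a local emulator)
        Pipeline pipeline = Pipeline.create(options);
        // ... apply Firestore read/write transforms here, then pipeline.run() ...
      }
    }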
- setFlexRSGoal(DataflowPipelineOptions.FlexResourceSchedulingGoal) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- setFlinkConfDir(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setFlinkMaster(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setForceSlotSharingGroup(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setForceStreaming(boolean) - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
- setForceUnalignedCheckpointEnabled(boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setFormat(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
- setFormat(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
-
The format of the file content.
- setFormat(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- setFormat(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
- setFormat(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- setFormat(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
- setFormat(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setFormat(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setFormatClass(Class<?>) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
- setFormatProviderClass(Class<?>) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
- setFromSnapshotExclusive(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setFromSnapshotInclusive(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setFromSnapshotRefExclusive(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setFromSnapshotRefInclusive(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setFromTimestamp(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setGcpCredential(Credentials) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
- setGcpOauthScopes(List<String>) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
- setGcpTempLocation(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
- setGcsCustomAuditEntries(GcsOptions.GcsCustomAuditEntries) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setGcsEndpoint(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setGcsHttpRequestReadTimeout(Integer) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setGcsHttpRequestWriteTimeout(Integer) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setGcsPerformanceMetrics(Boolean) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setGcsReadCounterPrefix(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setGcsRewriteDataOpBatchLimit(Integer) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setGcsUploadBufferSizeBytes(Integer) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setGcsUploadBufferSizeBytes(Integer) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
- setGcsUtil(GcsUtil) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setGcsWriteCounterPrefix(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setGCThrashingPercentagePerPeriod(Double) - Method in interface org.apache.beam.sdk.options.MemoryMonitorOptions
- setGetOffsetFn(SerializableFunction<V, Long>) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
- setGetReceiverArgsFromConfigFn(SerializableFunction<PluginConfig, Object[]>) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
- setGlobalConfigRefreshPeriod(Duration) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setGoogleAdsClientId(String) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
- setGoogleAdsClientSecret(String) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
- setGoogleAdsCredential(Credentials) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
- setGoogleAdsCredentialFactoryClass(Class<? extends CredentialFactory>) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
- setGoogleAdsDeveloperToken(String) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
- setGoogleAdsEndpoint(String) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
- setGoogleAdsRefreshToken(String) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
- setGoogleApiTrace(GoogleApiDebugOptions.GoogleApiTracer) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions
- setGoogleCloudStorageReadOptions(GoogleCloudStorageReadOptions) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setGroupFilesFileLoad(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setGroupingTableMaxSizeMb(int) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
- setGzipCompressHeapDumps(boolean) - Method in interface org.apache.beam.sdk.options.MemoryMonitorOptions
- setHdfsConfiguration(List<Configuration>) - Method in interface org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions
- setHeaderColumns(PCollectionView<List<String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
- setHeaderColumns(PCollectionView<List<String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
- setHeaderColumns(PCollectionView<List<String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
- setHeartbeatMillis(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the heartbeat interval in milliseconds.
- setHoldability(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- setHooks(DataflowRunnerHooks) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
-
Sets callbacks to invoke during execution; see DataflowRunnerHooks.
- setHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
- setHost(String) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
- setHost(String) - Method in class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder.Configuration
- setHotKeyLoggingEnabled(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- setHttpClient(HttpClient) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- setHttpClientConfiguration(HttpClientConfiguration) - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
- setHttpHeaders(Map<String, String>) - Method in class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
- setHttpPipeline(HttpPipeline) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- setHTTPReadTimeout(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setHTTPWriteTimeout(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setHumanReadableJsonRecord(String) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
- setId(int) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
- setId(int) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
- setIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- setIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
- setIdLabel(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
- setIdLabel(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
- setIdleShutdownTimeout(String) - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- setImpersonateServiceAccount(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
- setIndex(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
- setInferMaps(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions.Builder
- setInitialPositionInStream(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setInitialTimestampInStream(Long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setInput(Input) - Method in class org.apache.beam.sdk.io.cdap.context.BatchSourceContextImpl
-
Overrides the input configuration of this batch job with the specified Input.
- setInputFile(String) - Method in interface org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.WordCountOptions
- setInsertBundleParallelism(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setInspectConfig(InspectConfig) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
- setInspectConfig(InspectConfig) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
- setInspectConfig(InspectConfig) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
- setInspectTemplateName(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
- setInspectTemplateName(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
- setInspectTemplateName(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
- setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
- setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
- setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
- setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
- setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
- setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
- setIsBoundedInternal(PCollection.IsBounded) - Method in class org.apache.beam.sdk.values.PCollection
-
For internal use only; no backwards-compatibility guarantees.
- setIsReadSeekEfficient(boolean) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
- setIsWindmillServiceDirectPathEnabled(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setJavaAggregateFunctions(ImmutableMap<List<String>, Combine.CombineFn<?, ?, ?>>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.Builder
- setJavaClassLookupAllowlist(JavaClassLookupTransformProvider.AllowList) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- setJavaClassLookupAllowlistFile(String) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- setJavaScalarFunctions(ImmutableMap<List<String>, UserFunctionDefinitions.JavaScalarFunction>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.Builder
- setJdbcType(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setJdbcType(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- setJdbcUrl(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setJdbcUrl(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- setJdkAddOpenModules(List<String>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- setJdkAddOpenModules(List<String>) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
- setJdkAddRootModules(List<String>) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
- setJetDefaultParallelism(Integer) - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- setJetLocalMode(Integer) - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- setJetProcessorsCooperative(Boolean) - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- setJetServers(String) - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
- setJfrRecordingDurationSec(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setJobCheckIntervalInSecs(int) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setJobEndpoint(String) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- setJobFileZip(String) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- setJobId(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
- setJobLabelsMap(Map<String, String>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setJobName(String) - Method in interface org.apache.beam.sdk.options.PipelineOptions
- setJobServerConfig(String...) - Method in interface org.apache.beam.runners.portability.testing.TestPortablePipelineOptions
- setJobServerDriver(Class<JobServerDriver>) - Method in interface org.apache.beam.runners.portability.testing.TestPortablePipelineOptions
- setJobServerTimeout(int) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- setJobType(String) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- setJsonToRowWithErrFn(JsonToRow.JsonToRowWithErrFn) - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError.Builder
- setKeep(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setKeep(List<String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
- setKeep(JavaRowUdf.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration.Builder
- setKeepFields(List<String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setKeyContextElement1(StreamRecord) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator
-
Note: This is only relevant when we have a stateful DoFn.
- setKeyDeserializer(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setKeySerializer(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
- setKinesisIOConsumerArns(Map<String, String>) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions
- setKmsKey(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- setKmsKey(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setLabels(Map<String, String>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- setLanguage(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration.Builder
- setLanguage(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
- setLanguageHint(String) - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText.Builder
- setLastContiguousSequenceRange(ContiguousSequenceRange) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
- setLastEventReceived(boolean) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
- setLastModifiedMillis(long) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
- setLastProcessedSequence(Long) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
- setLatencyNanos(Long) - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult.Builder
- setLatencyTrackingInterval(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setLatestBufferedSequence(Long) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
- setLength(Long) - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
- setLevel(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration.Builder
- setLineField(String) - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
-
Sets the field name for the line field in the returned Row.
- setListeners(List<JavaStreamingListener>) - Method in interface org.apache.beam.runners.spark.SparkContextOptions
- setLoadBalanceBundles(boolean) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- setLocalJobServicePortFile(String) - Method in interface org.apache.beam.runners.portability.testing.TestUniversalRunner.Options
- setLocalWindmillHostport(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setLocation(String) - Method in class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar.Configuration
- setLocation(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- setLocation(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
- setLocation(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setLocation(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- setLoginTimeout(String) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setLogMdc(boolean) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
- setManifestListLocation(String) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
- setMapKeyType(FieldValueTypeInformation) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
- setMapping(Contextful<Contextful.Fn<KV<K, V>, KV<K, Long>>>) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey.Builder
- setMapping(Contextful<Contextful.Fn<T, Long>>) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally.Builder
- setMapValueType(FieldValueTypeInformation) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
- setMaxBufferingDurationMilliSec(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setMaxBundlesFromWindmillOutstanding(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setMaxBundleSize(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setMaxBundleTimeMills(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setMaxBytesFromWindmillOutstanding(long) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setMaxCacheMemoryUsageMb(int) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
- setMaxCacheMemoryUsageMbClass(Class<? extends SdkHarnessOptions.MaxCacheMemoryUsageMb>) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
- setMaxCacheMemoryUsagePercent(float) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
- setMaxCapacityPerShard(Long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setMaxConnectionPoolConnections(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setMaxNumberOfRecords(Long) - Method in class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder.Configuration
- setMaxNumRecords(Long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setMaxNumRecords(Long) - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration.Builder
- setMaxNumRecords(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setMaxNumWorkers(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
- setMaxNumWritersPerBundle(Integer) - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration.Builder
- setMaxOutputElementsPerBundle(int) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Overrides the default limit on the number of output elements per bundle.
- setMaxParallelism(Integer) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setMaxReadTime(Long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setMaxReadTime(Long) - Method in class org.apache.beam.sdk.io.GenerateSequence.External.ExternalConfiguration
- setMaxReadTime(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setMaxReadTimeSeconds(Integer) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setMaxReadTimeSecs(Long) - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration.Builder
- setMaxRecordsPerBatch(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- setMaxStackTraceDepthToReport(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setMaxStreamingBatchSize(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setMaxStreamingRowsToBatch(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setMemoryMB(int) - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
-
Sets the size of the memory buffer in megabytes.
- setMessageId(Long) - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions.Builder
- setMessageId(String) - Method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey.Builder
- setMessageId(String) - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult.Builder
- setMessageId(String) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- setMessageName(String) - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions.Builder
- setMessageName(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- setMessageName(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
- setMessageName(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setMessageName(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setMessageRecord(Object) - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessage
- setMetadataTable(String) - Method in interface org.apache.beam.sdk.io.gcp.spanner.SpannerIO.SpannerChangeStreamOptions
-
Specifies the name of the metadata table.
- setMethod(Method) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
- setMetricsGraphiteHost(String) - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- setMetricsGraphitePort(Integer) - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- setMetricsHttpSinkUrl(String) - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- setMetricsPushPeriod(Long) - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- setMetricsSink(Class<? extends MetricsSink>) - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
- setMetricsSupported(boolean) - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
-
Called by the runner to indicate whether metrics reporting is supported.
- setMimeType(String) - Method in class org.apache.beam.sdk.io.fs.CreateOptions.Builder
- setMinConnectionPoolConnections(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setMinCpuPlatform(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
- setMinPauseBetweenCheckpoints(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setMinReadTimeMillis(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- setName(String) - Method in class org.apache.beam.runners.spark.translation.BoundedDataset
- setName(String) - Method in interface org.apache.beam.runners.spark.translation.Dataset
- setName(String) - Method in class org.apache.beam.runners.spark.translation.streaming.UnboundedDataset
- setName(String) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
- setName(String) - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
- setName(String) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Destination.Builder
- setName(String) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
- setName(String) - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
- setName(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration.Builder
- setName(String) - Method in class org.apache.beam.sdk.values.PCollection
-
Sets the name of this PCollection.
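For example (the name is illustrative; rename a PCollection right after creating it, before it is consumed):

    // words is assumed to be a freshly created PCollection<String>.
    words.setName("CleanedWords");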
- setName(String) - Method in class org.apache.beam.sdk.values.PValueBase
-
Sets the name of this PValueBase.
- setNetwork(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
- setNetworkTimeout(Executor, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- setNoSpilling(Boolean) - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration.Builder
- setNullable(boolean) - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
- setNullable(boolean) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
- setNumber(Integer) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
- setNumberOfBufferedEvents(long) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
- setNumberOfExecutionRetries(Integer) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setNumberOfReceivedEvents(long) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
- setNumberOfWorkerHarnessThreads(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setNumConcurrentCheckpoints(int) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setNumFailuresExpected(int) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- setNumPartitions(Integer) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setNumSampledBytesPerFile(long) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
- setNumShards(int) - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration.Builder
- setNumShards(Integer) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
-
The number of output shards produced; a value of 1 disables sharding.
- setNumStorageWriteApiStreamAppendClients(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setNumStorageWriteApiStreams(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setNumStreamingKeys(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setNumStreams(Integer) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setNumWorkers(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
- setOauthToken(String) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setOAuthToken(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setObjectReuse(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setOffsetDeduplication(Boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setOnCreateMatcher(SerializableMatcher<PipelineResult>) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
- setOneOfTypes(Map<String, FieldValueTypeInformation>) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
- setOnly(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setOnly(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
- setOnSuccessMatcher(SerializableMatcher<PipelineResult>) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
- setOperation(String) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
- setOperatorChaining(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- SetOperatorFilteringDoFn(String, String, BeamSetOperatorRelBase.OpType, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.SetOperatorFilteringDoFn
- setOption(String, Schema.FieldType, Object) - Method in class org.apache.beam.sdk.schemas.Schema.Options.Builder
- setOption(String, Schema.FieldType, Object) - Static method in class org.apache.beam.sdk.schemas.Schema.Options
- setOption(String, Row) - Method in class org.apache.beam.sdk.schemas.Schema.Options.Builder
- setOption(String, Row) - Static method in class org.apache.beam.sdk.schemas.Schema.Options
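A minimal sketch of building schema options with these setters (the option name and value are illustrative):

    import org.apache.beam.sdk.schemas.Schema;

    Schema.Options options =
        Schema.Options.builder()
            .setOption("origin", Schema.FieldType.STRING, "users_table") // illustrative option
            .build();
    Schema schema = Schema.builder().addStringField("name").setOptions(options).build();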
- setOptions(Schema.Options) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
Sets the given options on the schema being built.
- setOptions(Schema.Options) - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
- setOptions(Schema.Options.Builder) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
- setOptions(Schema.Options.Builder) - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
- setOptions(ImmutableMap<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setOptionsId(long) - Method in interface org.apache.beam.sdk.options.PipelineOptions
- setOutput(String) - Method in interface org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.WordCountOptions
- setOutput(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling.Builder
- setOutput(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling.Builder
- setOutput(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling.Builder
- setOutput(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling.Builder
- setOutputDataSet(PCollection<T>, TSet<WindowedValue<T>>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- setOutputExecutablePath(String) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- setOutputFilePrefix(String) - Method in interface org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample.DCExamplePipelineOptions
- setOutputParallelization(Boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setOutputParallelization(Boolean) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
- setOutputPrefix(String) - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration.Builder
- setOverrideWindmillBinary(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setParallelism(int) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- setParallelism(Integer) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setParam(String, Object) - Method in class org.apache.beam.sdk.io.cdap.ConfigWrapper
-
Sets a single Plugin parameter.
- setParameters(PreparedStatement) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.StatementPreparator
- setParameters(PreparedStatement) - Method in interface org.apache.beam.sdk.io.singlestore.SingleStoreIO.StatementPreparator
- setParameters(KV<PartitionT, PartitionT>, PreparedStatement) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcReadWithPartitionsHelper
- setParameters(T, PreparedStatement) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.PreparedStatementSetter
- setParentId(Long) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
- setParentTokens(HashSet<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the collection of parent partition identifiers.
- setParquetConfiguration(FileWriteSchemaTransformConfiguration.ParquetConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
-
Configures extra details related to writing Parquet formatted files.
- setPartitionColumn(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setPartitionFields(List<String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig.Builder
- setPartitionFields(List<String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
- setPartitionKey(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
- setPartitionToken(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the unique partition identifier.
- setPassword(String) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
- setPassword(String) - Method in class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder.Configuration
- setPassword(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setPassword(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- setPassword(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setPassword(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setPath(String) - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration.Builder
- setPath(String) - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration.Builder
- setPath(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration.Builder
- setPathValidator(PathValidator) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setPathValidatorClass(Class<? extends PathValidator>) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
- setPayload(byte[]) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
- setPayload(byte[]) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- setPayload(byte[]) - Method in class org.apache.beam.sdk.schemas.io.Failure.Builder
- setPeriod(Long) - Method in class org.apache.beam.sdk.io.GenerateSequence.External.ExternalConfiguration
- setPeriodicStatusPageOutputDirectory(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setPerWorkerMetricsUpdateReportingPeriodMillis(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setPipelineOption(String, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- setPipelineOptions(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
- setPipelineOptions(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcConnection
- setPipelineOptionsMap(Map<String, String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcConnection
-
Only called from the BeamCalciteSchema.
- setPipelinePolicy(HttpPipelinePolicy) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- setPipelineUrl(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- setPlannerName(String) - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
- setPluginClass(Class<?>) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
- setPluginType(PluginConstants.PluginType) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
- setPollInterval(Duration) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setPollIntervalMillis(Long) - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
- setPort(int) - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
- setPort(Integer) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
- setPort(String) - Method in class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder.Configuration
- setPortNumber(String) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setPreBundleCallback(Runnable) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- setPrecision(int) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally.Builder
- setPrecision(int) - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
- setPrecision(Integer) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey.Builder
- setPredefinedCsvFormat(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.CsvConfiguration.Builder
-
The Enum.name() of the predefined CSV format used for the written CSV file.
- setPreferGroupByKeyToHandleHugeValues(Boolean) - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- setPrefix(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration.Builder
- setPrimaryKey(TableReference, List<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- setPrimaryKey(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setPriority(int) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- setPrismLocation(String) - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- setPrismLogLevel(String) - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- setPrismVersionOverride(String) - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
- setPrivateKeyPassphrase(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setPrivateKeyPassphrase(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setPrivateKeyPath(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setPrivateKeyPath(String) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setProcessWideContainer(MetricsContainer) - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
-
Set the MetricsContainer for the current process.
- setProducerConfig(Map<String, String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
- setProducerConfigUpdates(Map<String, String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setProduceStatusUpdateOnEveryEvent(boolean) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Sets whether a status notification should be produced for every event.
- setProfilingAgentConfiguration(DataflowProfilingOptions.DataflowProfilingAgentConfiguration) - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
- setProject(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- setProject(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
- setProject(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- setProject(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
- setProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
- setProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
- setProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
- setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
- setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
- setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
- setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
- setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
- setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
- setProvidedSparkContext(JavaSparkContext) - Method in interface org.apache.beam.runners.spark.SparkContextOptions
- setProvidedSparkContext(JavaSparkContext) - Static method in class org.apache.beam.runners.spark.translation.SparkContextFactory
-
Set an externally managed JavaSparkContext that will be used if SparkPipelineOptions.getUsesProvidedSparkContext() is set to true.
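A minimal sketch, assuming the surrounding application owns the Spark context (master URL and app name are illustrative):

    import org.apache.beam.runners.spark.SparkContextOptions;
    import org.apache.beam.runners.spark.SparkRunner;
    import org.apache.beam.runners.spark.translation.SparkContextFactory;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.spark.api.java.JavaSparkContext;

    JavaSparkContext jsc = new JavaSparkContext("local[2]", "beam-on-spark"); // illustrative
    SparkContextFactory.setProvidedSparkContext(jsc);
    SparkContextOptions options = PipelineOptionsFactory.as(SparkContextOptions.class);
    options.setUsesProvidedSparkContext(true);
    options.setRunner(SparkRunner.class);
    Pipeline pipeline = Pipeline.create(options);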
- setProviderRuntimeValues(ValueProvider<Map<String, Object>>) - Method in interface org.apache.beam.sdk.testing.TestPipeline.TestValueProviderOptions
- setProxyConfiguration(ProxyConfiguration) - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
- setPublished(Boolean) - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult.Builder
- setPublishMonotonicNanos(long) - Method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey.Builder
- setPubsubRootUrl(String) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
- setQualifiers(List<FieldAccessDescriptor.FieldDescriptor.Qualifier>) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Builder
- setQuery(String) - Method in class org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar.Configuration
- setQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
-
Configures the BigQuery read job with the SQL query.
- setQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- setQuery(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
- setQuery(String) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
- setQuery(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setQuery(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setQueryLocation(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
-
BigQuery geographic location where the query job will be executed.
- setQueryPlannerClassName(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
- setQueryString(String) - Method in interface org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample.DCExamplePipelineOptions
- setQueue(Queue) - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
-
This method is called in the SolaceIO.Read.expand(org.apache.beam.sdk.values.PBegin) method to set the Queue reference based on SolaceIO.Read.from(Solace.Queue) or SolaceIO.Read.from(Solace.Topic).
- setQueueUrl(String) - Method in class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration.Builder
- setRamMegaBytes(int) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- setRate(GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder
- setRateLimit(Long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setRawPrivateKey(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setRawPrivateKey(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setRawType(Class<?>) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
- setReadChangeStreamTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
- setReadChangeStreamTimeout(Duration) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
- setReaderCacheTimeoutSec(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setReadOnly(boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- setReadQuery(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setReadTimeout(int) - Method in class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
- setReadTimePercentage(Double) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- setReadTimestamp(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setReceiverClass(Class<? extends Receiver<V>>) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
- setReceiveTimestamp(Long) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- setRecord(BadRecord.Record) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Builder
- setRecordJfrOnGcThrashing(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setRedelivered(boolean) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- setRedistribute(Boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setRedistributeNumKeys(Integer) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setReference(byte[]) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- setRegion(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- setRegion(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setRegion(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
- setReidentifyConfig(DeidentifyConfig) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
- setReidentifyTemplateName(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
- setReIterableGroupByKeyResult(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setRelative() - Method in interface org.apache.beam.sdk.state.Timer
-
Sets the timer relative to the current time, according to any offset and alignment specified.
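For instance, a minimal sketch of a processing-time timer set relative to the current time (the timer ID and delay are illustrative):

    import org.apache.beam.sdk.state.TimeDomain;
    import org.apache.beam.sdk.state.Timer;
    import org.apache.beam.sdk.state.TimerSpec;
    import org.apache.beam.sdk.state.TimerSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;
    import org.joda.time.Duration;

    class ExpiryFn extends DoFn<KV<String, String>, String> {
      @TimerId("expiry")
      private final TimerSpec expirySpec = TimerSpecs.timer(TimeDomain.PROCESSING_TIME);

      @ProcessElement
      public void process(@TimerId("expiry") Timer timer) {
        // Fire five minutes after the current processing time.
        timer.offset(Duration.standardMinutes(5)).setRelative();
      }
    }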
- setRemoteHeapDumpLocation(String) - Method in interface org.apache.beam.sdk.options.MemoryMonitorOptions
- setReplicationGroupMessageId(String) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- setReplyTo(Solace.Destination) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- setReportCheckpointDuration(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setRequestRecordsLimit(Long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setResourceHints(List<String>) - Method in interface org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions
- setResourceHints(ResourceHints) - Method in class org.apache.beam.sdk.transforms.PTransform
-
Sets resource hints for the transform.
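A minimal sketch (MyLargeModelFn is a placeholder DoFn<String, String>; hint values are illustrative and support varies by runner):

    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.transforms.resourcehints.ResourceHints;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<String> output =
        input.apply(
            "ProcessWithHints",
            ParDo.of(new MyLargeModelFn())
                .setResourceHints(ResourceHints.create().withMinRam("6 GiB")));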
- setResourceId(ResourceId) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
- setResultCount(long) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
- setRetainDockerContainers(boolean) - Method in interface org.apache.beam.sdk.options.ManualDockerEnvironmentOptions
- setRetainExternalizedCheckpointsOnCancellation(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setRole(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setRole(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setRootElement(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration.Builder
-
Sets the enclosing root element for the generated XML files.
- setRowGroupSize(Integer) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration.Builder
-
Specify row-group size; if not set or zero, a default is used by the underlying writer.
- setRowRestriction(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- setRowSchema(Schema) - Method in class org.apache.beam.sdk.values.PCollection
-
Sets a schema on this PCollection.
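For example, a minimal sketch attaching a schema to a PCollection<Row> (field names are illustrative):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    Schema schema = Schema.builder().addStringField("name").addInt32Field("age").build();
    // rows is assumed to be a PCollection<Row> produced by an upstream transform.
    PCollection<Row> withSchema = rows.setRowSchema(schema);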
- setRuleSets(Collection<RuleSet>) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Sets the rule sets used by the query optimizer.
- setRunner(Class<? extends PipelineRunner<?>>) - Method in interface org.apache.beam.sdk.options.PipelineOptions
- setRunnerDeterminedSharding(boolean) - Method in interface org.apache.beam.runners.direct.DirectTestOptions
- setRunningAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the time at which the partition started running.
- sets(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The TypeDescriptor for Set.
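A one-line sketch:

    import java.util.Set;
    import org.apache.beam.sdk.values.TypeDescriptor;
    import org.apache.beam.sdk.values.TypeDescriptors;

    TypeDescriptor<Set<String>> setOfStrings = TypeDescriptors.sets(TypeDescriptors.strings());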
- Sets - Class in org.apache.beam.sdk.transforms
-
The PTransforms for computing different set functions across PCollections.
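A minimal sketch, assuming left and right are PCollection<String>s with compatible coders:

    import org.apache.beam.sdk.transforms.Sets;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<String> inBoth = left.apply(Sets.intersectDistinct(right));
    PCollection<String> inEither = left.apply(Sets.unionDistinct(right));
    PCollection<String> leftOnly = left.apply(Sets.exceptDistinct(right));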
- Sets() - Constructor for class org.apache.beam.sdk.transforms.Sets
- setS3ClientBuilder(S3ClientBuilder) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
- setS3ClientFactoryClass(Class<? extends S3ClientBuilderFactory>) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- setS3StorageClass(String) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- setS3StorageClass(String) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
- setS3ThreadPoolSize(int) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- setS3ThreadPoolSize(int) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
- setS3UploadBufferSizeBytes(int) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
- setS3UploadBufferSizeBytes(Integer) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- setSamplingStrategy(TextRowCountEstimator.SamplingStrategy) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
- setSasToken(String) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
- setSaveHeapDumpsToGcsPath(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setSavepoint() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- setSavepoint(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- setSavepointPath(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setSaveProfilesToGcs(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
- setScale(int) - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
- setScanType(IcebergScanConfig.ScanType) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setScheduledAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the time at which the partition was scheduled.
- setScheduledExecutorService(ScheduledExecutorService) - Method in interface org.apache.beam.sdk.options.ExecutorOptions
- setSchema(byte[]) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setSchema(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- setSchema(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
- setSchema(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- setSchema(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- setSchema(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
- setSchema(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setSchema(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setSchema(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setSchema(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setSchema(Schema) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setSchema(Schema) - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig.Builder
- setSchema(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.values.PCollection
-
Sets a Schema on this PCollection.
- setSchemaId(Integer) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
- setSchemaIfNotPresent(String, Schema) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
- setSchematizedData(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
- setScheme(String) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
- setSdkContainerImage(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
- setSdkHarnessContainerImageOverrides(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setSdkHarnessLogLevelOverrides(SdkHarnessOptions.SdkHarnessLogLevelOverrides) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
- setSdkWorkerParallelism(int) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
- setSeconds(Long) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate.Builder
- setSelectedFields(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- setSemiPersistDir(String) - Method in interface org.apache.beam.sdk.options.RemoteEnvironmentOptions
- setSenderTimestamp(Long) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- setSequenceNumber(long) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
- setSequenceNumber(Long) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- setSerializedWindowingStrategy(byte[]) - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration.Builder
- setServerName(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setServerName(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setServiceAccount(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- setServiceEndpoint(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setServiceEndpoint(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
- setSessionDurationSecs(Integer) - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider.Builder
-
The session duration in seconds for the authentication request; by default this value is 3600.
- setShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
-
Uses the given ShardNameTemplate for naming output files.
- setShardTemplate(String) - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration.Builder
- setShouldFailRow(Function<TableRow, Boolean>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- setShutdownSourcesAfterIdleMs(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setSideInput(PCollectionView<T>, BoundedWindow, T) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner.
- setSideInputDataSet(String, BatchTSet<WindowedValue<ElemT>>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
- setSideInputs(Map<PCollectionView<?>, Map<BoundedWindow, ?>>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated. Use TestPipeline with the DirectRunner; see the sketch below.
- setSize(Long) - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
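A minimal sketch of the TestPipeline replacement recommended by the two deprecated DoFnTester setters above; the DoFn under test and the expected values are invented:

  import org.apache.beam.sdk.testing.PAssert;
  import org.apache.beam.sdk.testing.TestPipeline;
  import org.apache.beam.sdk.transforms.Create;
  import org.apache.beam.sdk.transforms.ParDo;

  // Typically declared as a JUnit @Rule.
  TestPipeline p = TestPipeline.create();

  PCollection<Integer> out =
      p.apply(Create.of(1, 2, 3))
       .apply(ParDo.of(new DoubleFn()));  // hypothetical DoFn under test

  PAssert.that(out).containsInAnyOrder(2, 4, 6);  // illustrative expectation
  p.run().waitUntilFinish();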
- setSizeBytes(long) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
- setSizeEstimator(CoderSizeEstimator<KV<ByteString, ChangeStreamRecord>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
-
Sets the estimator to track throughput for each DoFn instance.
- setSkipHeaderLines(int) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
- setSnapshot(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setSnapshotId(long) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
- setSnowPipe(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setSorterType(ExternalSorter.Options.SorterType) - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
-
Sets the sorter type.
- setSourceContext(SourceFunction.SourceContext<WindowedValue<ValueWithRecordId<OutputT>>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
-
Visible so that we can set this in tests.
- setSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
- setSparkMaster(String) - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- setSql(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setSqlScalarFunctions(ImmutableMap<List<String>, ResolvedNodes.ResolvedCreateFunctionStmt>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.Builder
- setSqlTableValuedFunctions(ImmutableMap<List<String>, ResolvedNode>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.Builder
- setSSEAlgorithm(String) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
- setSSEAlgorithm(String) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- setSSECustomerKey(SSECustomerKey) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- setSSECustomerKey(SSECustomerKey) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
- setSSEKMSKeyId(String) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
- setSSEKMSKeyId(String) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
- setStableUniqueNames(PipelineOptions.CheckEnabled) - Method in interface org.apache.beam.sdk.options.PipelineOptions
- setStager(Stager) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setStagerClass(Class<? extends Stager>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setStagingBucketName(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setStagingBucketName(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setStagingLocation(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- setStaleness(Long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setStart(Long) - Method in class org.apache.beam.sdk.io.GenerateSequence.External.ExternalConfiguration
- setStart(Long) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder
- setStartAtTimestamp(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
- setStartingStrategy(IcebergIO.ReadRows.StartingStrategy) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setStartOffset(Long) - Method in interface org.apache.beam.sdk.io.sparkreceiver.HasOffset
- setStartReadTime(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setStartTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the start time of the partition.
- setState(PartitionMetadata.State) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the current state of the partition.
- setState(PipelineResult.State) - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
- SetState<T> - Interface in org.apache.beam.sdk.state
-
A ReadableState cell containing a set of elements; see the sketch below.
- setStateBackend(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
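A minimal sketch of SetState in a stateful DoFn, assuming the standard state API; the class and state names are invented:

  import org.apache.beam.sdk.coders.StringUtf8Coder;
  import org.apache.beam.sdk.state.SetState;
  import org.apache.beam.sdk.state.StateSpec;
  import org.apache.beam.sdk.state.StateSpecs;
  import org.apache.beam.sdk.transforms.DoFn;
  import org.apache.beam.sdk.values.KV;

  // Emits each value only the first time it is seen for a given key.
  class DedupValuesFn extends DoFn<KV<String, String>, String> {
    @StateId("seen")
    private final StateSpec<SetState<String>> seenSpec = StateSpecs.set(StringUtf8Coder.of());

    @ProcessElement
    public void process(ProcessContext c, @StateId("seen") SetState<String> seen) {
      String value = c.element().getValue();
      if (!seen.contains(value).read()) {  // contains(...) returns a ReadableState<Boolean>
        seen.add(value);
        c.output(value);
      }
    }
  }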
- setStateBackendFactory(Class<? extends FlinkStateBackendFactory>) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
Deprecated. Please use setStateBackend instead.
- setStateBackendStoragePath(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setStatistics(BeamTableStatistics) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
- setStatusDate(Instant) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
- setStatusUpdateFrequency(Duration) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
-
Changes the default status update frequency.
- setStop(Long) - Method in class org.apache.beam.sdk.io.GenerateSequence.External.ExternalConfiguration
- setStopPipelineWatermark(Long) - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
- setStopReadTime(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setStorageApiAppendThresholdBytes(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setStorageApiAppendThresholdRecordCount(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setStorageClient(Storage) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
- setStorageIntegrationName(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setStorageIntegrationName(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setStorageLevel(String) - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
- setStorageWriteApiMaxRequestSize(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setStorageWriteApiMaxRetries(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setStorageWriteApiTriggeringFrequencySec(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setStorageWriteMaxInflightBytes(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setStorageWriteMaxInflightRequests(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setStreaming(boolean) - Method in interface org.apache.beam.sdk.options.StreamingOptions
- setStreaming(Boolean) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setStreamingSideInputCacheExpirationMillis(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setStreamingSideInputCacheMb(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setStreamingTimeoutMs(Long) - Method in interface org.apache.beam.runners.spark.SparkPortableStreamingPipelineOptions
- setStreamName(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setStreamName(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
- setStuckCommitDurationMillis(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setSubmissionMode(SolaceIO.SubmissionMode) - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
-
Called by the write connector to set the submission mode used to create the message producers.
- setSubnetwork(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
- setSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
- setSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- setSubscriptionName(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
- setSubscriptionPath(SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions.Builder
- setSummary(Map<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
- setSupportKafkaMetrics(boolean) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
- setSupportMetricsDeletion(boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
- setSupportStreamingInsertsMetrics(boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
- setTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setTable(String) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
- setTable(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
- setTable(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
- setTable(String) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
- setTable(String) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration.Builder
- setTable(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setTable(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- settableArguments - Variable in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
-
This should be set after the SubmitterLifecycle.prepareRun(Object) call, passing this context object as a parameter.
- setTableCreateConfig(IcebergTableCreateConfig) - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination.Builder
- setTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
- setTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
- setTableId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
- setTableId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
- setTableIdentifier(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setTableIdentifier(String...) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setTableIdentifier(TableIdentifier) - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination.Builder
- setTableIdentifier(TableIdentifier) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setTableProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig.Builder
- setTableProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
- setTableSchema(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
- setTableSpec(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
-
Specifies a table for a BigQuery read job.
- setTableSpec(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- setTag(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setTargetDataset(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
- setTargetParallelism(int) - Method in interface org.apache.beam.runners.direct.DirectOptions
- setTempDatasetId(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setTemplateLocation(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Sets the Cloud Storage path where the Dataflow template will be stored.
- setTempLocation(String) - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
-
Sets the path to a temporary location where the sorter writes intermediate files.
- setTempLocation(String) - Method in interface org.apache.beam.sdk.options.PipelineOptions
- setTempRoot(String) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
- SETTER_WITH_NULL_METHOD_ERROR - Static variable in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- setTerminateAfterSecondsSinceNewOutput(Long) - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
- SetterTypeSupplier() - Constructor for class org.apache.beam.sdk.schemas.JavaBeanSchema.SetterTypeSupplier
- setTestMode(boolean) - Method in interface org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineOptions
- setTestTimeoutSeconds(Long) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
- setThroughputEstimator(BytesThroughputEstimator<DataChangeRecord>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
-
Sets the estimator to calculate the backlog of this function.
- setTimer(StateNamespace, String, String, Instant, Instant, TimeDomain) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- setTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- setTimer(Instant, TimeDomain) - Method in interface org.apache.beam.sdk.state.Timers
-
Sets a timer to fire when the event time watermark, the current processing time, or the synchronized processing time watermark surpasses a given timestamp.
- setTimer(Instant, Instant, TimeDomain) - Method in interface org.apache.beam.sdk.state.Timers
-
Sets a timer with an explicit output timestamp.
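In user code, timers are usually set through the Timer parameter injected into a stateful DoFn rather than through the Timers interface directly; a minimal sketch (the class, timer name, and delay are invented):

  import org.apache.beam.sdk.state.TimeDomain;
  import org.apache.beam.sdk.state.Timer;
  import org.apache.beam.sdk.state.TimerSpec;
  import org.apache.beam.sdk.state.TimerSpecs;
  import org.apache.beam.sdk.transforms.DoFn;
  import org.apache.beam.sdk.values.KV;
  import org.joda.time.Duration;

  class FlushFn extends DoFn<KV<String, String>, String> {
    @TimerId("flush")
    private final TimerSpec flushSpec = TimerSpecs.timer(TimeDomain.PROCESSING_TIME);

    @ProcessElement
    public void process(ProcessContext c, @TimerId("flush") Timer flush) {
      // Fire (or re-arm) one minute of processing time from now.
      flush.offset(Duration.standardMinutes(1)).setRelative();
    }

    @OnTimer("flush")
    public void onFlush(OnTimerContext ctx) {
      ctx.output("flushed");  // illustrative; real code would emit buffered state
    }
  }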
- setTimestamp(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setTimestamp(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
- setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
- setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
- setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
- setTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
- setTimestampBoundMode(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setTimestampMillis(long) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
- setTimestampPolicy(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setTimeSupplier(Supplier<Timestamp>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- setTimeToLive(long) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
- setTimeUnit(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
- setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
- setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
- setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
- setTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
- setTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
-
Sets the topic from which to read.
- setTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setTopicName(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
- setTopicPath(TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions.Builder
- setTopics(List<String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setToSnapshot(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setToSnapshotRef(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setToTimestamp(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setTransactionIsolation(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- setTransformNameMapping(Map<String, String>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setTriggeringFrequencySeconds(Integer) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
- setTriggeringFrequencySeconds(Long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setTruncateTimestamps(boolean) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptions
- setTruncateTimestamps(BigQueryUtils.ConversionOptions.TruncateTimestamps) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.Builder
- setTSetEnvironment(TSetEnvironment) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- setTwister2Home(String) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- setType(Solace.DestinationType) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Destination.Builder
- setType(Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
- setType(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
- setTypeDescriptor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.values.PCollection
-
Sets the TypeDescriptor<T> for this PCollection<T>.
- setTypeMap(Map<String, Class<?>>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- setUint16Behavior(UnsignedOptions.Behavior) - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Builder
- setUint32Behavior(UnsignedOptions.Behavior) - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Builder
- setUint64Behavior(UnsignedOptions.Behavior) - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Builder
- setUint8Behavior(UnsignedOptions.Behavior) - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Builder
- setUnalignedCheckpointEnabled(boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
- setUnboundedReaderMaxElements(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setUnboundedReaderMaxReadTimeMs(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setUnboundedReaderMaxReadTimeSec(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setUnboundedReaderMaxWaitForElementsMs(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setUnknownFieldsPayload(byte[]) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
- setup() - Method in interface org.apache.beam.io.requestresponse.SetupTeardown
-
Called during the DoFn's setup lifecycle method.
- setup() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datagen.DataGeneratorRowFn
- setup() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
- setup() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
- setup() - Method in class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
- setup() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
-
Obtains the instance of DetectNewPartitionsAction.
- setup() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
-
Constructs instances for the PartitionMetadataDao, ChangeStreamDao, ChangeStreamRecordMapper, PartitionMetadataMapper, DataChangeRecordAction, HeartbeatRecordAction, ChildPartitionsRecordAction, PartitionStartRecordAction, PartitionEndRecordAction, PartitionEventRecordAction and QueryChangeStreamAction.
- setup() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
- setup() - Method in interface org.apache.beam.sdk.io.kafka.CheckStopReadingFn
- setup() - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark.AtomicAccumulatorState
- setup() - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark.PlainAccumulatorState
- setup() - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark.VolatileAccumulatorState
- setup() - Method in class org.apache.beam.sdk.io.pulsar.WriteToPulsarDoFn
- setup() - Method in class org.apache.beam.sdk.jmh.schemas.RowBundle
- setup() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayCopyState
- setup() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayNewState
- setup() - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark.Bytes
- setup() - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark.Longs
- setup() - Method in class org.apache.beam.sdk.transforms.Reshuffle.AssignShardFn
- setup(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
- setup(StreamTask<?, ?>, StreamConfig, Output<StreamRecord<WindowedValue<OutputT>>>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- setup(Blackhole) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark.BlackholeOutput
- setup(IterationParams, ThreadParams) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark.ProducerState
- setUp() - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- setUp() - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- setUp(Schema) - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlSeekableTable
-
Prepares the instance.
- setUpdate(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
- setUpdateCompatibilityVersion(String) - Method in interface org.apache.beam.sdk.options.StreamingOptions
- setUploadBufferSizeBytes(int) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions.Builder
- setupModule(Module.SetupContext) - Method in class org.apache.beam.sdk.io.aws2.options.AwsModule
- SetupTeardown - Interface in org.apache.beam.io.requestresponse
-
Provided by the user and called within the DoFn.Setup and DoFn.Teardown lifecycle methods of Call's DoFn.
- setUpToDateThreshold(Long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setUrl(String) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setUseActiveSparkSession(boolean) - Method in interface org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineOptions
- setUseAltsServer(boolean) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
- setUseAtLeastOnceSemantics(Boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setUseCatalog(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Uses the specified InMemoryCatalog.
- setUseCdc(boolean) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
- setUseCdcWrites(Boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setUseDataStreamForBatch(Boolean) - Method in interface org.apache.beam.runners.flink.VersionDependentFlinkPipelineOptions
- setUsePublicIps(Boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
- setUserAgent(String) - Method in interface org.apache.beam.sdk.options.PipelineOptions
- setUsername(String) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
- setUsername(String) - Method in class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder.Configuration
- setUsername(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
- setUsername(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- setUsername(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setUsername(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setUseSeparateWindmillHeartbeatStreams(Boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setUsesProvidedSparkContext(boolean) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
- setUseStandardSql(Boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
-
Enables BigQuery's Standard SQL dialect when reading from a query.
- setUseStorageApiConnectionPool(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setUseStorageWriteApi(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setUseStorageWriteApiAtLeastOnce(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setUseTransformService(boolean) - Method in interface org.apache.beam.sdk.extensions.python.PythonExternalTransformOptions
- setUseWindmillIsolatedChannels(Boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setUuid(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- setUUID(UUID) - Method in class org.apache.beam.sdk.schemas.Schema
-
Set this schema's UUID.
- setUuidExtractor(SerializableFunction<SequencedMessage, Uuid>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
- SetUuidFn(String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage.SetUuidFn
- SetUuidFromPubSubMessage(String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage
- setValidate(boolean) - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration.Builder
- setValueDeserializer(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setValueSerializer(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
- setVerifyCertificate(Boolean) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setVerifyCertificate(Boolean) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
- setVerifyRowValues(Boolean) - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
- setWarehouse(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
- setWarehouse(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
- setWatermark(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the watermark (last processed timestamp) for the partition.
- setWatermark(Instant) - Method in interface org.apache.beam.sdk.transforms.splittabledofn.ManualWatermarkEstimator
-
Sets a timestamp before or at the timestamps of all future elements produced by the associated DoFn.
- setWatermark(Instant) - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.Manual
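A sketch of the two setWatermark entries above in context: a fragment of a splittable DoFn that advances its watermark manually (restriction and tracker methods are omitted, and the starting watermark is invented):

  import org.apache.beam.sdk.transforms.DoFn;
  import org.apache.beam.sdk.transforms.splittabledofn.ManualWatermarkEstimator;
  import org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators;
  import org.joda.time.Instant;

  // Inside a splittable DoFn (not a complete DoFn on its own):

  @GetInitialWatermarkEstimatorState
  public Instant initialWatermarkState() {
    return new Instant(0L);  // illustrative starting watermark
  }

  @NewWatermarkEstimator
  public WatermarkEstimators.Manual newEstimator(@WatermarkEstimatorState Instant state) {
    return new WatermarkEstimators.Manual(state);
  }

  // In @ProcessElement, declare a ManualWatermarkEstimator<Instant> parameter and
  // advance the watermark as elements are emitted:
  //   estimator.setWatermark(lastEmittedTimestamp);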
- setWatermarkIdleDurationThreshold(Long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setWatermarkPolicy(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
- setWebIdTokenProviderFQCN(String) - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider.Builder
-
The fully qualified class name for the web id token provider.
- setWindmillGetDataStreamCount(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setWindmillHarnessUpdateReportingPeriod(Duration) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setWindmillMessagesBetweenIsReadyChecks(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setWindmillRequestBatchedGetWorkResponse(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setWindmillServiceCommitThreads(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setWindmillServiceEndpoint(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setWindmillServicePort(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setWindmillServiceRpcChannelAliveTimeoutSec(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setWindmillServiceStreamingLogEveryNStreamFailures(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setWindmillServiceStreamingRpcBatchLimit(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setWindmillServiceStreamingRpcHealthCheckPeriodMs(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setWindmillServiceStreamMaxBackoffMillis(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
- setWindowedWrites() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
Indicates that the operation will be performing windowed writes.
- setWindowingStrategyInternal(WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
-
For internal use only; no backwards-compatibility guarantees.
- setWithAttributes(Boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
- setWithPartitions(Boolean) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
- setWorkerCacheMb(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
- setWorkerCPUs(int) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
- setWorkerDiskType(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
- setWorkerHarnessContainerImage(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Deprecated.
- setWorkerId(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
- setWorkerLogLevelOverrides(DataflowWorkerLoggingOptions.WorkerLogLevelOverrides) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated.
- setWorkerMachineType(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
- setWorkerPool(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
- setWorkerRegion(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
- setWorkerSystemErrMessageLevel(DataflowWorkerLoggingOptions.Level) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated.
- setWorkerSystemOutMessageLevel(DataflowWorkerLoggingOptions.Level) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated.
- setWorkerZone(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
- setWriteDisposition(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
- setWriteDisposition(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
- setWriteStatement(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
- setWriteTimeout(int) - Method in class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
- setXmlConfiguration(FileWriteSchemaTransformConfiguration.XmlConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
-
Configures extra details related to writing XML formatted files.
- setZetaSqlDefaultTimezone(String) - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
- setZone(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
Deprecated. Use GcpOptions.setWorkerZone(java.lang.String) instead.
- sha1Bytes(byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
-
SHA1(X)
- sha1String(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
-
SHA1(X)
- sha256Bytes(byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
-
SHA256(X)
- sha256String(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
-
SHA256(X)
- sha512Bytes(byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
-
SHA512(X)
- sha512String(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
-
SHA512(X)
- SHARD_BITS - Static variable in class org.apache.beam.runners.dataflow.internal.IsmFormat
- ShardedKey<K> - Class in org.apache.beam.sdk.values
-
A key and a shard number.
- ShardedKeyCoder<KeyT> - Class in org.apache.beam.sdk.coders
- ShardedKeyCoder(Coder<KeyT>) - Constructor for class org.apache.beam.sdk.coders.ShardedKeyCoder
- ShardingFunction<UserT, DestinationT> - Interface in org.apache.beam.sdk.io
-
Function for assigning ShardedKeys to input elements for sharded WriteFiles.
- ShardNameTemplate - Class in org.apache.beam.sdk.io
-
Standard shard naming templates.
- ShardNameTemplate() - Constructor for class org.apache.beam.sdk.io.ShardNameTemplate
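A minimal sketch of applying a standard template to a file-based write; the output path is invented, and withShardNameTemplate is assumed to be available on TextIO.Write as in other file-based sinks:

  import org.apache.beam.sdk.io.ShardNameTemplate;
  import org.apache.beam.sdk.io.TextIO;

  // INDEX_OF_MAX yields suffixes like "-00000-of-00005".
  lines.apply(
      TextIO.write()
          .to("/tmp/out/part")  // illustrative output prefix
          .withShardNameTemplate(ShardNameTemplate.INDEX_OF_MAX)
          .withSuffix(".txt"));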
- shardRefreshInterval(Duration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation.Builder
-
Refresh interval for shards.
- shortCircuitReturnNull(StackManipulation, StackManipulation) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- shorts() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The TypeDescriptor for Short.
- shoudBundleElements() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- shoudBundleElements() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.PartialReduceBundleOperator
- shouldCache(PTransform<?, ? extends PValue>, PValue) - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Caches the PCollection if SparkPipelineOptions.isCacheDisabled is false, or if the transform isn't a GroupByKey transformation and the PCollection is used more than once in the Pipeline.
- shouldConvertRaggedUnionTypesToVarying() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
- shouldDefer(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
- shouldPublishLatencyMetrics() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- shouldRepeat() - Method in exception class org.apache.beam.io.requestresponse.UserCodeExecutionException
-
Reports whether the thrown exception warrants repeat execution.
- shouldRepeat() - Method in exception class org.apache.beam.io.requestresponse.UserCodeQuotaException
-
Reports that quota errors should be repeated.
- shouldRepeat() - Method in exception class org.apache.beam.io.requestresponse.UserCodeRemoteSystemException
-
Reports that remote system errors should be repeated.
- shouldRepeat() - Method in exception class org.apache.beam.io.requestresponse.UserCodeTimeoutException
-
Reports that timeouts should be repeated.
- shouldResume() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
-
If false, the DoFn promises that there is no more work remaining for the current element, so the runner should not resume the DoFn.ProcessElement call.
- shouldRetry(InsertRetryPolicy.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
-
Returns true if this failure should be retried; see the sketch below.
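A minimal sketch of choosing a retry policy for BigQuery streaming inserts; the table spec and input PCollection are invented:

  import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
  import org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy;

  // Retry only failures the policy classifies as transient
  // (applies to the streaming-inserts write method).
  tableRows.apply(
      BigQueryIO.writeTableRows()
          .to("my-project:my_dataset.my_table")  // illustrative table spec
          .withFailedInsertRetryPolicy(InsertRetryPolicy.retryTransientErrors()));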
- shutdown() - Method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
- SHUTDOWN - Enum constant in enum class org.apache.beam.runners.local.ExecutionDriver.DriverState
- sideInput(PCollectionView<SideInputT>) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Returns the value of a given side input.
- sideInput(PCollectionView<SideInputT>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns the value of a given side input.
- sideInput(PCollectionView<T>) - Method in interface org.apache.beam.sdk.state.StateContext
-
Returns the value of the side input for the corresponding state window.
- sideInput(PCollectionView<T>) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.Context
-
Returns the value of the side input for the window corresponding to the main input's window in which values are being combined.
- sideInput(PCollectionView<T>) - Method in class org.apache.beam.sdk.transforms.Contextful.Fn.Context
-
Accesses the given side input.
- sideInput(PCollectionView<T>) - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
-
Returns the value of the side input.
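A minimal sketch of the side-input access pattern behind this method: materialize a view, register it on the ParDo, and read it in the DoFn (the input PCollections are invented):

  import org.apache.beam.sdk.transforms.DoFn;
  import org.apache.beam.sdk.transforms.ParDo;
  import org.apache.beam.sdk.transforms.View;
  import org.apache.beam.sdk.values.PCollectionView;

  PCollectionView<Integer> thresholdView = thresholds.apply(View.asSingleton());

  PCollection<Integer> kept =
      input.apply(
          ParDo.of(new DoFn<Integer, Integer>() {
                @ProcessElement
                public void process(ProcessContext c) {
                  if (c.element() >= c.sideInput(thresholdView)) {
                    c.output(c.element());
                  }
                }
              })
              .withSideInputs(thresholdView));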
- SideInputBroadcast<T> - Class in org.apache.beam.runners.spark.util
-
Broadcast helper for side inputs.
- sideInputHandler - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- sideInputId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
- SideInputInitializer<ViewT> - Class in org.apache.beam.runners.flink.translation.functions
-
BroadcastVariableInitializer that initializes the broadcast input as a Map from window to side input.
- SideInputInitializer(PCollectionView<ViewT>) - Constructor for class org.apache.beam.runners.flink.translation.functions.SideInputInitializer
- sideInputJoin(PCollection<Row>, PCollection<Row>, FieldAccessDescriptor, FieldAccessDescriptor) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputJoinRel
- SideInputMetadata - Class in org.apache.beam.runners.spark.translation
-
Metadata class for side inputs in the Spark runner.
- sideInputReader - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- SideInputReaderFactory - Class in org.apache.beam.runners.spark.util
-
Utility class for creating and managing side input readers in the Spark runner.
- SideInputReaderFactory() - Constructor for class org.apache.beam.runners.spark.util.SideInputReaderFactory
- sideInputs - Variable in class org.apache.beam.runners.flink.translation.functions.FlinkPartialReduceFunction
- sideInputs - Variable in class org.apache.beam.runners.flink.translation.functions.FlinkReduceFunction
- sideInputs - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- Side Inputs - Search tag in class org.apache.beam.sdk.transforms.ParDo
-
Section
- SideInputSpec() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
- sideInputTagMapping - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- SideInputValues<T> - Interface in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
-
SideInputValues serves as a Kryo serializable container that contains a materialized view of side inputs.
- SideInputValues.BaseSideInputValues<BinaryT, ValuesT, T> - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
- SideInputValues.ByWindow<T> - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
-
General SideInputValues for BoundedWindows in two possible states.
- SideInputValues.Global<T> - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
-
Specialized SideInputValues for use with the GlobalWindow in two possible states.
- SideInputValues.Loader<T> - Interface in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
-
Factory function for loading SideInputValues from a Dataset.
- signalStart() - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
Outputs a message that the pipeline has started.
- signalSuccessWhen(Coder<T>, SerializableFunction<Set<T>, Boolean>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
Invocation of TestPubsubSignal.signalSuccessWhen(Coder, SerializableFunction, SerializableFunction) with Object.toString() as the formatter.
- signalSuccessWhen(Coder<T>, SerializableFunction<T, String>, SerializableFunction<Set<T>, Boolean>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
Outputs a success message when successPredicate evaluates to true.
- SimpleCombineFn(SerializableFunction<Iterable<V>, V>) - Constructor for class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
-
Deprecated.
- SimpleFunction<InputT, OutputT> - Class in org.apache.beam.sdk.transforms
-
A SerializableFunction which is not a functional interface.
- SimpleFunction() - Constructor for class org.apache.beam.sdk.transforms.SimpleFunction
- SimpleFunction(SerializableFunction<InputT, OutputT>) - Constructor for class org.apache.beam.sdk.transforms.SimpleFunction
- SimpleIdentifierContext(FieldSpecifierNotationParser.DotExpressionComponentContext) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
- SimpleRateLimitPolicy(double) - Constructor for class org.apache.beam.sdk.io.googleads.GoogleAdsV19.SimpleRateLimitPolicy
- SimpleRateLimitPolicy(double, long, TimeUnit) - Constructor for class org.apache.beam.sdk.io.googleads.GoogleAdsV19.SimpleRateLimitPolicy
- SimpleRemoteEnvironment() - Constructor for class org.apache.beam.runners.fnexecution.environment.RemoteEnvironment.SimpleRemoteEnvironment
- SINGLE_FILE_OR_SUBRANGE - Enum constant in enum class org.apache.beam.sdk.io.FileBasedSource.Mode
- SINGLE_WINDOW - Enum constant in enum class org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator.Type
- singleByteEncodeDoLoopByteString(VarIntBenchmark.Bytes, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- singleByteEncodeDoLoopTwiddleByteString(VarIntBenchmark.Bytes, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- singleByteEncodeLoopByteString(VarIntBenchmark.Bytes, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- singleByteEncodeUnrolledByteString(VarIntBenchmark.Bytes, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- SingleEmitInputDStream<T> - Class in org.apache.beam.runners.spark.translation
-
A specialized ConstantInputDStream that emits its RDD exactly once.
- SingleEmitInputDStream(StreamingContext, RDD<T>) - Constructor for class org.apache.beam.runners.spark.translation.SingleEmitInputDStream
- SingleEnvironmentInstanceJobBundleFactory - Class in org.apache.beam.runners.fnexecution.control
-
Deprecated. Replace with a DefaultJobBundleFactory when appropriate if the EnvironmentFactory is a DockerEnvironmentFactory, or create an InProcessJobBundleFactory and inline the creation of the environment if appropriate.
- singleOutputOverrideFactory() - Static method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides
-
Returns a PTransformOverrideFactory that replaces a single-output ParDo with a composite transform specialized for the DataflowRunner.
- SingleStoreIO - Class in org.apache.beam.sdk.io.singlestore
-
IO to read and write data on SingleStoreDB.
- SingleStoreIO() - Constructor for class org.apache.beam.sdk.io.singlestore.SingleStoreIO
- SingleStoreIO.DataSourceConfiguration - Class in org.apache.beam.sdk.io.singlestore
-
A POJO describing a SingleStoreDB DataSource by providing all properties needed to create it.
- SingleStoreIO.Read<T> - Class in org.apache.beam.sdk.io.singlestore
-
A PTransform for reading data from SingleStoreDB.
- SingleStoreIO.Read.SingleStoreRowMapperInitializationException - Exception Class in org.apache.beam.sdk.io.singlestore
- SingleStoreIO.ReadWithPartitions<T> - Class in org.apache.beam.sdk.io.singlestore
-
A PTransform for reading data from SingleStoreDB.
- SingleStoreIO.RowMapper<T> - Interface in org.apache.beam.sdk.io.singlestore
-
An interface used by SingleStoreIO.Read and SingleStoreIO.ReadWithPartitions for converting each row of the ResultSet into an element of the resulting PCollection.
- SingleStoreIO.RowMapperWithCoder<T> - Interface in org.apache.beam.sdk.io.singlestore
-
A RowMapper that provides a Coder for the resulting PCollection.
- SingleStoreIO.RowMapperWithInit<T> - Interface in org.apache.beam.sdk.io.singlestore
-
A RowMapper that requires initialization.
- SingleStoreIO.StatementPreparator - Interface in org.apache.beam.sdk.io.singlestore
-
An interface used by SingleStoreIO.Read to set the parameters of the PreparedStatement.
- SingleStoreIO.UserDataMapper<T> - Interface in org.apache.beam.sdk.io.singlestore
-
An interface used by SingleStoreIO.Write to map data from each element of a PCollection to a List of Strings.
- SingleStoreIO.Write<T> - Class in org.apache.beam.sdk.io.singlestore
-
A PTransform for writing data to SingleStoreDB; see the sketch below.
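A minimal sketch of a SingleStoreIO read; the connection details and query are invented, and the builder method names follow the entries listed in this index:

  import org.apache.beam.sdk.io.singlestore.SingleStoreIO;

  SingleStoreIO.DataSourceConfiguration config =
      SingleStoreIO.DataSourceConfiguration.create("db-host:3306")  // invented endpoint
          .withDatabase("mydb")
          .withUsername("admin")
          .withPassword("secret");

  PCollection<String> names =
      p.apply(
          SingleStoreIO.<String>read()
              .withDataSourceConfiguration(config)
              .withQuery("SELECT name FROM users")
              .withRowMapper(resultSet -> resultSet.getString(1)));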
-
Configuration for reading from SingleStoreDB.
- SingleStoreSchemaTransformReadConfiguration() - Constructor for class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
- SingleStoreSchemaTransformReadConfiguration.Builder - Class in org.apache.beam.sdk.io.singlestore.schematransform
- SingleStoreSchemaTransformReadProvider - Class in org.apache.beam.sdk.io.singlestore.schematransform
-
An implementation of TypedSchemaTransformProvider for SingleStoreDB read jobs configured using SingleStoreSchemaTransformReadConfiguration.
- SingleStoreSchemaTransformReadProvider() - Constructor for class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadProvider
- SingleStoreSchemaTransformWriteConfiguration - Class in org.apache.beam.sdk.io.singlestore.schematransform
-
Configuration for writing to SingleStoreDB.
- SingleStoreSchemaTransformWriteConfiguration() - Constructor for class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
- SingleStoreSchemaTransformWriteConfiguration.Builder - Class in org.apache.beam.sdk.io.singlestore.schematransform
- SingleStoreSchemaTransformWriteProvider - Class in org.apache.beam.sdk.io.singlestore.schematransform
-
An implementation of TypedSchemaTransformProvider for SingleStoreDB write jobs configured using SingleStoreSchemaTransformWriteConfiguration.
- SingleStoreSchemaTransformWriteProvider() - Constructor for class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
- singleTable(TableIdentifier, Schema) - Static method in interface org.apache.beam.sdk.io.iceberg.DynamicDestinations
- SingletonKeyedWorkItem<K, ElemT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
-
Singleton keyed work item.
- SingletonKeyedWorkItem(K, WindowedValue<ElemT>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItem
- SingletonKeyedWorkItemCoder<K, ElemT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
-
Singleton keyed work item coder.
- singletonView(PCollection<T>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>, boolean, T, Coder<T>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a PCollectionView<T> capable of processing elements windowed using the provided WindowingStrategy.
- singletonViewUsingVoidKey(TupleTag<Materializations.MultimapView<Void, T>>, PCollection<KV<Void, T>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>, boolean, T, Coder<T>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
- SingleWindowFlinkCombineRunner<K, InputT, AccumT, OutputT, W> - Class in org.apache.beam.runners.flink.translation.functions
-
A Flink combine runner that takes elements pre-grouped by window and produces output after seeing all input.
- SingleWindowFlinkCombineRunner() - Constructor for class org.apache.beam.runners.flink.translation.functions.SingleWindowFlinkCombineRunner
- sinh(Double) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinTrigonometricFunctions
-
SINH(X)
- sink - Variable in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
The Sink that this WriteOperation will write to.
- sink() - Static method in class org.apache.beam.sdk.io.TextIO
-
Creates a TextIO.Sink that writes newline-delimited strings in UTF-8, for use with FileIO.write().
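As a usage sketch (the output directory is a placeholder), the sink plugs into FileIO.write():

    // lines is a PCollection<String>
    lines.apply(FileIO.<String>write()
        .via(TextIO.sink())
        .to("/tmp/text-output/")); // placeholder path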
- sink() - Static method in class org.apache.beam.sdk.io.TFRecordIO
- sink(Class<ElementT>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
An AvroIO.Sink for use with FileIO.write() and FileIO.writeDynamic(), writing elements of the given generated class, like AvroIO.write(Class).
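A hedged usage sketch (MyRecord stands in for a generated Avro class; the path is a placeholder):

    records.apply(FileIO.<MyRecord>write()
        .via(AvroIO.sink(MyRecord.class)) // MyRecord: hypothetical generated class
        .to("/tmp/avro-output/")); // placeholder path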
- sink(Class<T>) - Static method in class org.apache.beam.sdk.io.xml.XmlIO
-
Outputs records as XML-formatted elements using JAXB.
- sink(String) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
An AvroIO.Sink for use with FileIO.write() and FileIO.writeDynamic(), writing elements with a given (common) schema, like AvroIO.writeGenericRecords(String).
- sink(String) - Static method in class org.apache.beam.sdk.extensions.yaml.YamlTransform
-
Creates a new YamlTransform mapping a single input PCollection<Row> to a single PCollection<Row> output.
- sink(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
An AvroIO.Sink for use with FileIO.write() and FileIO.writeDynamic(), writing elements with a given (common) schema, like AvroIO.writeGenericRecords(Schema).
- sink(Schema) - Static method in class org.apache.beam.sdk.io.parquet.ParquetIO
-
Creates a ParquetIO.Sink for use with FileIO.write().
- sink(TProtocolFactory) - Static method in class org.apache.beam.sdk.io.thrift.ThriftIO
- Sink() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
- Sink() - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
- Sink() - Constructor for class org.apache.beam.sdk.io.TextIO.Sink
- Sink() - Constructor for class org.apache.beam.sdk.io.TFRecordIO.Sink
- Sink() - Constructor for class org.apache.beam.sdk.io.thrift.ThriftIO.Sink
- Sink() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO.Sink
- SINK - Enum constant in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
- SINK - Enum constant in enum class org.apache.beam.sdk.io.cdap.PluginConstants.PluginType
- SINK - Enum constant in enum class org.apache.beam.sdk.metrics.Lineage.Type
- SinkMetrics - Class in org.apache.beam.sdk.metrics
-
Standard Sink Metrics.
- SinkMetrics() - Constructor for class org.apache.beam.sdk.metrics.SinkMetrics
- SINKV2 - Enum constant in enum class org.apache.beam.sdk.metrics.Lineage.Type
- sinkViaGenericRecords(Schema, AvroIO.RecordFormatter<ElementT>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Deprecated. RecordFormatter will be removed in future versions.
- size() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns the number of bytes in the backing array that are valid.
- size() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- size() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedList
- size() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedRow
- size() - Method in class org.apache.beam.sdk.fn.data.WeightedList
- size() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- size() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- size() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
Returns the number of columns for this schema.
- size() - Method in class org.apache.beam.sdk.values.PCollectionList
-
Returns the number of PCollections in this PCollectionList.
- size() - Method in class org.apache.beam.sdk.values.TupleTagList
-
Returns the number of TupleTags in this TupleTagList.
- sizeBytes() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
- SizeEstimator<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
-
This class is used to estimate the size in bytes of a given element.
- SizeEstimator<T> - Interface in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
- SizeEstimator(Coder<T>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.SizeEstimator
- sizeOf(T) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.CoderSizeEstimator
-
Estimates the size in bytes of the given element with the configured Coder.
- sizeOf(T) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.NullSizeEstimator
- sizeOf(T) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.SizeEstimator
- sizeOf(T) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.SizeEstimator
-
Estimates the size in bytes of the given element with the configured Coder.
- Sketch() - Constructor for class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
- SketchFrequencies - Class in org.apache.beam.sdk.extensions.sketching
-
PTransforms to compute the estimated frequency of each element in a stream.
- SketchFrequencies() - Constructor for class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
- SketchFrequencies.CountMinSketchFn<InputT> - Class in org.apache.beam.sdk.extensions.sketching
-
Implements the Combine.CombineFn of SketchFrequencies transforms.
- SketchFrequencies.GlobalSketch<InputT> - Class in org.apache.beam.sdk.extensions.sketching
-
Implementation of SketchFrequencies.globally().
- SketchFrequencies.PerKeySketch<K,
V> - Class in org.apache.beam.sdk.extensions.sketching -
Implementation of
SketchFrequencies.perKey()
. - SketchFrequencies.Sketch<T> - Class in org.apache.beam.sdk.extensions.sketching
-
Wraps StreamLib's Count-Min sketch to support counting all user types by hashing the encoded user type using the supplied deterministic coder.
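A minimal sketch of the global variant described in the SketchFrequencies entries above (the words PCollection is a placeholder):

    // One Count-Min sketch over the whole PCollection<String>.
    PCollection<SketchFrequencies.Sketch<String>> sketch =
        words.apply(SketchFrequencies.<String>globally());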
- SKIP - Enum constant in enum class org.apache.beam.sdk.io.FileIO.ReadMatches.DirectoryTreatment
- SKIP_CLEANUP - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ExistingPipelineOptions
- SKIP_IF_DESTINATION_EXISTS - Enum constant in enum class org.apache.beam.sdk.io.fs.MoveOptions.StandardMoveOptions
- skipAssignWindows(Window.Assign<T>, EvaluationContext) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Checks if the window transformation should be applied or skipped.
- skipCertificateVerification() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
-
Optional flag to skip certificate verification.
- skipCertificateVerification(boolean) - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
-
Optional flag to skip certificate verification.
- SkipCertificateVerificationTrustManagerProvider() - Constructor for class org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory.SkipCertificateVerificationTrustManagerProvider
- skipIfEmpty() - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Don't write any output files if the PCollection is empty.
- skipInvalidRows() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Insert all valid rows of a request, even if invalid rows exist.
- Slf4jLogWriter - Class in org.apache.beam.runners.fnexecution.logging
-
A LogWriter which uses an SLF4J Logger as the underlying log backend.
- SLIDING_WINDOW_TVF - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.TVFStreamingUtils
- SlidingWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A WindowFn that windows values into possibly overlapping fixed-size timestamp-based windows.
- SMALL_INT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
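Illustrating the SlidingWindows entry above, a sketch of ten-minute windows sliding every minute (input is a placeholder PCollection):

    input.apply(Window.<KV<String, Long>>into(
        SlidingWindows.of(Duration.standardMinutes(10))
            .every(Duration.standardMinutes(1))));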
- smallest(int) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<List<T>> with a single element containing the smallest count elements of the input PCollection<T>, in increasing order, sorted according to their natural order.
- Smallest() - Constructor for class org.apache.beam.sdk.transforms.Top.Smallest
-
Deprecated.
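A short sketch of Top.smallest from the entry above (numbers is a placeholder PCollection<Integer>):

    // Single output element: the 10 smallest values, in increasing order.
    PCollection<List<Integer>> bottomTen = numbers.apply(Top.<Integer>smallest(10));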
- smallestDoublesFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a Top.TopCombineFn that aggregates the smallest count double values.
- smallestFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a Top.TopCombineFn that aggregates the smallest count values.
- smallestIntsFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a Top.TopCombineFn that aggregates the smallest count int values.
- smallestLongsFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a Top.TopCombineFn that aggregates the smallest count long values.
- smallestPerKey(int) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a PCollection<KV<K, List<V>>> that contains an output element mapping each distinct key in the input PCollection to the smallest count values associated with that key in the input PCollection<KV<K, V>>, in increasing order, sorted according to their natural order.
- SNAPPY - Enum constant in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- SNAPPY - Enum constant in enum class org.apache.beam.sdk.io.Compression
-
Google Snappy compression.
- SNAPPY - Enum constant in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- SnappyCoder<T> - Class in org.apache.beam.sdk.coders
-
Wraps an existing coder with Snappy compression.
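A brief sketch (the PCollection is a placeholder; StringUtf8Coder is just one example of a wrapped coder):

    Coder<String> compressed = SnappyCoder.of(StringUtf8Coder.of());
    lines.setCoder(compressed); // elements are Snappy-compressed when encoded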
- snapshot(SchemaVersion) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
- snapshotConfiguration() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- snapshotConfiguration() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueSerializer
- snapshotConfiguration() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer
- SnapshotInfo - Class in org.apache.beam.sdk.io.iceberg
-
This is an AutoValue representation of an Iceberg Snapshot.
- SnapshotInfo() - Constructor for class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- SnapshotInfo.Builder - Class in org.apache.beam.sdk.io.iceberg
- snapshotStart(long) - Method in class org.apache.beam.runners.flink.translation.utils.CheckpointStats
- snapshotState() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.LazyFlinkSourceSplitEnumerator
- snapshotState(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- snapshotState(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplitEnumerator
- snapshotState(long) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.LazyFlinkSourceSplitEnumerator
- snapshotState(FunctionSnapshotContext) - Method in class org.apache.beam.runners.flink.translation.functions.ImpulseSourceFunction
- snapshotState(FunctionSnapshotContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
- snapshotState(StateSnapshotContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- snapshotState(StateSnapshotContext) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.PartialReduceBundleOperator
- SnowflakeArray - Class in org.apache.beam.sdk.io.snowflake.data.structured
- SnowflakeArray() - Constructor for class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeArray
- SnowflakeBatchServiceConfig - Class in org.apache.beam.sdk.io.snowflake.services
-
Class for preparing configuration for batch write and read.
- SnowflakeBatchServiceConfig(SerializableFunction<Void, DataSource>, String, String, String, String, String, String, String) - Constructor for class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Creates a batch configuration for reading.
- SnowflakeBatchServiceConfig(SerializableFunction<Void, DataSource>, List<String>, SnowflakeTableSchema, String, String, String, String, CreateDisposition, WriteDisposition, String, String, String) - Constructor for class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Creates a batch configuration for writing.
- SnowflakeBatchServiceImpl - Class in org.apache.beam.sdk.io.snowflake.services
-
Implementation of SnowflakeServices.BatchService used in production.
- SnowflakeBatchServiceImpl() - Constructor for class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceImpl
- SnowflakeBinary - Class in org.apache.beam.sdk.io.snowflake.data.text
- SnowflakeBinary() - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
- SnowflakeBinary(long) - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
- SnowflakeBoolean - Class in org.apache.beam.sdk.io.snowflake.data.logical
- SnowflakeBoolean() - Constructor for class org.apache.beam.sdk.io.snowflake.data.logical.SnowflakeBoolean
- SnowflakeChar - Class in org.apache.beam.sdk.io.snowflake.data.text
- SnowflakeChar() - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeChar
- SnowflakeColumn - Class in org.apache.beam.sdk.io.snowflake.data
-
POJO describing a single column within a Snowflake table.
- SnowflakeColumn() - Constructor for class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
- SnowflakeColumn(String, SnowflakeDataType) - Constructor for class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
- SnowflakeColumn(String, SnowflakeDataType, boolean) - Constructor for class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
- SnowflakeDataType - Interface in org.apache.beam.sdk.io.snowflake.data
-
Interface for data types to provide their SQL representations.
- SnowflakeDate - Class in org.apache.beam.sdk.io.snowflake.data.datetime
- SnowflakeDate() - Constructor for class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeDate
- SnowflakeDateTime - Class in org.apache.beam.sdk.io.snowflake.data.datetime
- SnowflakeDateTime() - Constructor for class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeDateTime
- SnowflakeDecimal - Class in org.apache.beam.sdk.io.snowflake.data.numeric
- SnowflakeDecimal() - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeDecimal
- SnowflakeDecimal(int, int) - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeDecimal
- SnowflakeDouble - Class in org.apache.beam.sdk.io.snowflake.data.numeric
- SnowflakeDouble() - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeDouble
- SnowflakeFloat - Class in org.apache.beam.sdk.io.snowflake.data.numeric
- SnowflakeFloat() - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeFloat
- SnowflakeGeography - Class in org.apache.beam.sdk.io.snowflake.data.geospatial
- SnowflakeGeography() - Constructor for class org.apache.beam.sdk.io.snowflake.data.geospatial.SnowflakeGeography
- SnowflakeInteger - Class in org.apache.beam.sdk.io.snowflake.data.numeric
- SnowflakeInteger() - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeInteger
- SnowflakeIO - Class in org.apache.beam.sdk.io.snowflake
-
IO to read and write data on Snowflake.
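A hedged read sketch (account, credentials, table, integration name, and the MyRow POJO are hypothetical; the builder methods follow the documented SnowflakeIO surface and may vary by version):

    SnowflakeIO.DataSourceConfiguration dc =
        SnowflakeIO.DataSourceConfiguration.create()
            .withUsernamePasswordAuth("user", "password") // placeholders
            .withServerName("account.snowflakecomputing.com")
            .withDatabase("DB")
            .withSchema("PUBLIC");
    PCollection<MyRow> rows = pipeline.apply(
        SnowflakeIO.<MyRow>read()
            .withDataSourceConfiguration(dc)
            .fromTable("MY_TABLE") // placeholder
            .withStagingBucketName("my-bucket") // placeholder
            .withStorageIntegrationName("my-integration") // placeholder
            .withCsvMapper(parts -> new MyRow(parts[0], parts[1]))
            .withCoder(SerializableCoder.of(MyRow.class))); // MyRow must be Serializable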
- SnowflakeIO() - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO
- SnowflakeIO.Concatenate - Class in org.apache.beam.sdk.io.snowflake
-
Combines a list of Strings into one String with the paths where files were staged for the write.
- SnowflakeIO.CsvMapper<T> - Interface in org.apache.beam.sdk.io.snowflake
-
Interface for a user-defined function mapping parts of a CSV line into T.
- SnowflakeIO.DataSourceConfiguration - Class in org.apache.beam.sdk.io.snowflake
-
A POJO describing a DataSource, providing all properties needed to create a DataSource.
- SnowflakeIO.DataSourceProviderFromDataSourceConfiguration - Class in org.apache.beam.sdk.io.snowflake
-
Wraps SnowflakeIO.DataSourceConfiguration to provide a DataSource.
- SnowflakeIO.Read<T> - Class in org.apache.beam.sdk.io.snowflake
-
Implementation of SnowflakeIO.read().
- SnowflakeIO.Read.CleanTmpFilesFromGcsFn - Class in org.apache.beam.sdk.io.snowflake
-
Removes temporary staged files after reading.
- SnowflakeIO.Read.MapCsvToStringArrayFn - Class in org.apache.beam.sdk.io.snowflake
-
Parses Strings from the incoming PCollection into the proper format for CSV files.
- SnowflakeIO.UserDataMapper<T> - Interface in org.apache.beam.sdk.io.snowflake
-
Interface for a user-defined function mapping T into an array of Objects.
- SnowflakeIO.Write<T> - Class in org.apache.beam.sdk.io.snowflake
-
Implementation of SnowflakeIO.write().
- SnowflakeNumber - Class in org.apache.beam.sdk.io.snowflake.data.numeric
- SnowflakeNumber() - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
- SnowflakeNumber(int, int) - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
- SnowflakeNumeric - Class in org.apache.beam.sdk.io.snowflake.data.numeric
- SnowflakeNumeric() - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumeric
- SnowflakeNumeric(int, int) - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumeric
- SnowflakeObject - Class in org.apache.beam.sdk.io.snowflake.data.structured
- SnowflakeObject() - Constructor for class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeObject
- SnowflakePipelineOptions - Interface in org.apache.beam.sdk.io.snowflake
- SnowflakeReal - Class in org.apache.beam.sdk.io.snowflake.data.numeric
- SnowflakeReal() - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeReal
- SnowflakeServices - Interface in org.apache.beam.sdk.io.snowflake.services
-
Interface which defines common methods for interacting with Snowflake.
- SnowflakeServices.BatchService - Interface in org.apache.beam.sdk.io.snowflake.services
- SnowflakeServices.StreamingService - Interface in org.apache.beam.sdk.io.snowflake.services
- SnowflakeServicesImpl - Class in org.apache.beam.sdk.io.snowflake.services
- SnowflakeServicesImpl() - Constructor for class org.apache.beam.sdk.io.snowflake.services.SnowflakeServicesImpl
- SnowflakeStreamingServiceConfig - Class in org.apache.beam.sdk.io.snowflake.services
-
Class for preparing configuration for streaming write.
- SnowflakeStreamingServiceConfig(List<String>, String, SimpleIngestManager) - Constructor for class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceConfig
-
Constructor to create configuration for streaming write.
- SnowflakeStreamingServiceImpl - Class in org.apache.beam.sdk.io.snowflake.services
-
Implementation of SnowflakeServices.StreamingService used in production.
- SnowflakeStreamingServiceImpl() - Constructor for class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceImpl
- SnowflakeString - Class in org.apache.beam.sdk.io.snowflake.data.text
- SnowflakeString() - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeString
- SnowflakeString(long) - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeString
- SnowflakeTableSchema - Class in org.apache.beam.sdk.io.snowflake.data
-
POJO representing the schema of a table in Snowflake.
- SnowflakeTableSchema() - Constructor for class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
- SnowflakeTableSchema(SnowflakeColumn...) - Constructor for class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
- SnowflakeText - Class in org.apache.beam.sdk.io.snowflake.data.text
- SnowflakeText() - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeText
- SnowflakeText(long) - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeText
- SnowflakeTime - Class in org.apache.beam.sdk.io.snowflake.data.datetime
- SnowflakeTime() - Constructor for class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTime
- SnowflakeTimestamp - Class in org.apache.beam.sdk.io.snowflake.data.datetime
- SnowflakeTimestamp() - Constructor for class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestamp
- SnowflakeTimestampLTZ - Class in org.apache.beam.sdk.io.snowflake.data.datetime
- SnowflakeTimestampLTZ() - Constructor for class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampLTZ
- SnowflakeTimestampNTZ - Class in org.apache.beam.sdk.io.snowflake.data.datetime
- SnowflakeTimestampNTZ() - Constructor for class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampNTZ
- SnowflakeTimestampTZ - Class in org.apache.beam.sdk.io.snowflake.data.datetime
- SnowflakeTimestampTZ() - Constructor for class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampTZ
- SnowflakeTransformRegistrar - Class in org.apache.beam.sdk.io.snowflake.crosslanguage
-
Exposes SnowflakeIO.Read and SnowflakeIO.Write as external transforms for cross-language usage.
- SnowflakeTransformRegistrar() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.SnowflakeTransformRegistrar
- SnowflakeVarBinary - Class in org.apache.beam.sdk.io.snowflake.data.text
- SnowflakeVarBinary() - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarBinary
- SnowflakeVarchar - Class in org.apache.beam.sdk.io.snowflake.data.text
- SnowflakeVarchar() - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
- SnowflakeVarchar(long) - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
- SnowflakeVariant - Class in org.apache.beam.sdk.io.snowflake.data.structured
- SnowflakeVariant() - Constructor for class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeVariant
- SnsIO - Class in org.apache.beam.sdk.io.aws2.sns
-
IO to send notifications via SNS.
- SnsIO() - Constructor for class org.apache.beam.sdk.io.aws2.sns.SnsIO
- SnsIO.Write<T> - Class in org.apache.beam.sdk.io.aws2.sns
-
Implementation of SnsIO.write().
- SocketAddressFactory - Class in org.apache.beam.sdk.fn.channel
-
Creates a SocketAddress based upon a supplied string.
- SocketAddressFactory() - Constructor for class org.apache.beam.sdk.fn.channel.SocketAddressFactory
- socketTimeout() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Milliseconds to wait for data to be transferred over an established, open connection before the connection is timed out.
- socketTimeout(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Milliseconds to wait for data to be transferred over an established, open connection before the connection is timed out.
- Solace - Class in org.apache.beam.sdk.io.solace.data
-
Provides core data models and utilities for working with Solace messages in the context of Apache Beam pipelines.
- Solace() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace
- Solace.CorrelationKey - Class in org.apache.beam.sdk.io.solace.data
-
The correlation key is an object that is passed back to the client during the event broker ack or nack.
- Solace.CorrelationKey.Builder - Class in org.apache.beam.sdk.io.solace.data
- Solace.Destination - Class in org.apache.beam.sdk.io.solace.data
-
Represents a Solace message destination (either a Topic or a Queue).
- Solace.Destination.Builder - Class in org.apache.beam.sdk.io.solace.data
- Solace.DestinationType - Enum Class in org.apache.beam.sdk.io.solace.data
-
Represents a Solace destination type.
- Solace.PublishResult - Class in org.apache.beam.sdk.io.solace.data
-
The result of writing a message to Solace.
- Solace.PublishResult.Builder - Class in org.apache.beam.sdk.io.solace.data
- Solace.Queue - Class in org.apache.beam.sdk.io.solace.data
-
Represents a Solace queue.
- Solace.Record - Class in org.apache.beam.sdk.io.solace.data
-
Represents a Solace message record with its associated metadata.
- Solace.Record.Builder - Class in org.apache.beam.sdk.io.solace.data
- Solace.SolaceRecordMapper - Class in org.apache.beam.sdk.io.solace.data
-
A utility class for mapping BytesXMLMessage instances to Solace.Record objects.
- Solace.Topic - Class in org.apache.beam.sdk.io.solace.data
-
Represents a Solace topic.
- SolaceCheckpointMark - Class in org.apache.beam.sdk.io.solace.read
-
Checkpoint for an unbounded Solace source.
- SolaceIO - Class in org.apache.beam.sdk.io.solace
-
A PTransform to read from and write to a Solace event broker.
- SolaceIO() - Constructor for class org.apache.beam.sdk.io.solace.SolaceIO
- SolaceIO.Read<T> - Class in org.apache.beam.sdk.io.solace
- SolaceIO.SubmissionMode - Enum Class in org.apache.beam.sdk.io.solace
- SolaceIO.Write<T> - Class in org.apache.beam.sdk.io.solace
- SolaceIO.WriterType - Enum Class in org.apache.beam.sdk.io.solace
- SolaceMessageProducer - Class in org.apache.beam.sdk.io.solace.broker
- SolaceMessageProducer(XMLMessageProducer) - Constructor for class org.apache.beam.sdk.io.solace.broker.SolaceMessageProducer
- SolaceMessageReceiver - Class in org.apache.beam.sdk.io.solace.broker
- SolaceMessageReceiver(FlowReceiver) - Constructor for class org.apache.beam.sdk.io.solace.broker.SolaceMessageReceiver
- SolaceOutput - Class in org.apache.beam.sdk.io.solace.write
-
The SolaceIO.Write transform's output returns this type, containing the successful publishes (SolaceOutput.getSuccessfulPublish()).
- SolaceRecordMapper() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.SolaceRecordMapper
- solaceSessionServiceWithProducer() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- SolrIO - Class in org.apache.beam.sdk.io.solr
-
Transforms for reading and writing data from/to Solr.
- SolrIO.ConnectionConfiguration - Class in org.apache.beam.sdk.io.solr
-
A POJO describing a connection configuration to Solr.
- SolrIO.Read - Class in org.apache.beam.sdk.io.solr
-
A PTransform reading data from Solr.
- SolrIO.ReadAll - Class in org.apache.beam.sdk.io.solr
- SolrIO.ReplicaInfo - Class in org.apache.beam.sdk.io.solr
-
A POJO describing a replica of Solr.
- SolrIO.RetryConfiguration - Class in org.apache.beam.sdk.io.solr
-
A POJO encapsulating a configuration for retry behavior when issuing requests to Solr.
- SolrIO.Write - Class in org.apache.beam.sdk.io.solr
-
A PTransform writing data to Solr.
- sort() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
- sort() - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter
-
Sorts the added elements and returns an Iterable over the sorted elements.
- SORT_VALUES - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- sortBySchema(List<FieldValueTypeInformation>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference
- sorted() - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns an identical Schema with lexicographically sorted fields.
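A tiny sketch of sorted() on Schema (the Row entry below provides the matching method for values):

    Schema schema = Schema.builder().addStringField("b").addInt32Field("a").build();
    Schema sorted = schema.sorted(); // field order becomes: a, b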
- sorted() - Method in class org.apache.beam.sdk.values.Row
-
Returns an equivalent Row with fields lexicographically sorted by their name.
- SortedMapCoder<K, V> - Class in org.apache.beam.sdk.coders
- SortingFlinkCombineRunner<K, InputT, AccumT, OutputT, W> - Class in org.apache.beam.runners.flink.translation.functions
-
A Flink combine runner that first sorts the elements by window and then does one pass that merges windows and outputs results.
- SortingFlinkCombineRunner() - Constructor for class org.apache.beam.runners.flink.translation.functions.SortingFlinkCombineRunner
- SortValues<PrimaryKeyT, SecondaryKeyT, ValueT> - Class in org.apache.beam.sdk.extensions.sorter
-
SortValues<PrimaryKeyT, SecondaryKeyT, ValueT> takes a PCollection<KV<PrimaryKeyT, Iterable<KV<SecondaryKeyT, ValueT>>>> with elements consisting of a primary key and iterables over <secondary key, value> pairs, and returns a PCollection<KV<PrimaryKeyT, Iterable<KV<SecondaryKeyT, ValueT>>>> of the same elements but with values sorted by a secondary key.
- source() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
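A minimal sketch of the documented usage (grouped is a placeholder, typically the output of a GroupByKey):

    PCollection<KV<String, Iterable<KV<String, Integer>>>> sorted =
        grouped.apply(SortValues.<String, String, Integer>create(
            BufferedExternalSorter.options()));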
- source(String) - Static method in class org.apache.beam.sdk.extensions.yaml.YamlTransform
-
Creates a new YamlTransform mapping a PBegin to a single PCollection<Row> output.
- Source<T> - Class in org.apache.beam.sdk.io
-
Base class for defining input formats and creating a Source for reading the input.
- Source() - Constructor for class org.apache.beam.sdk.io.Source
- SOURCE - Enum constant in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
- SOURCE - Enum constant in enum class org.apache.beam.sdk.io.cdap.PluginConstants.PluginType
- SOURCE - Enum constant in enum class org.apache.beam.sdk.metrics.Lineage.Type
- SOURCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl
- SOURCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats
- SOURCE_DOES_NOT_NEED_SPLITTING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- SOURCE_ESTIMATED_SIZE_BYTES - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- SOURCE_IS_INFINITE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- SOURCE_METADATA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- SOURCE_SPEC - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- SOURCE_STEP_INPUT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- Source.Reader<T> - Class in org.apache.beam.sdk.io
-
The interface that readers of custom input sources must implement.
- SourceInputFormat<T> - Class in org.apache.beam.runners.flink.translation.wrappers
-
Wrapper for executing a Source as a Flink InputFormat.
- SourceInputFormat(String, BoundedSource<T>, PipelineOptions) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat
- SourceInputSplit<T> - Class in org.apache.beam.runners.flink.translation.wrappers
-
InputSplit for SourceInputFormat.
- SourceInputSplit() - Constructor for class org.apache.beam.runners.flink.translation.wrappers.SourceInputSplit
- SourceInputSplit(Source<T>, int) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.SourceInputSplit
- SourceMetrics - Class in org.apache.beam.sdk.metrics
-
Standard Source Metrics.
- SourceMetrics() - Constructor for class org.apache.beam.sdk.metrics.SourceMetrics
- sourceName() - Method in class org.apache.beam.runners.spark.metrics.SparkBeamMetricSource
- sourceName() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.SparkBeamMetricSource
- sourceOutput() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase.ReaderAndOutput
- SourceRDD - Class in org.apache.beam.runners.spark.io
-
Classes implementing Beam Source RDDs.
- SourceRDD() - Constructor for class org.apache.beam.runners.spark.io.SourceRDD
- SourceRDD.Bounded<T> - Class in org.apache.beam.runners.spark.io
- SourceRDD.Unbounded<T, CheckpointMarkT> - Class in org.apache.beam.runners.spark.io
-
A SourceRDD.Unbounded is the implementation of a micro-batch in a SourceDStream.
- SourceRecordJson - Class in org.apache.beam.io.debezium
-
This class can be used as a mapper for each SourceRecord retrieved.
- SourceRecordJson(SourceRecord) - Constructor for class org.apache.beam.io.debezium.SourceRecordJson
-
Initializer.
- SourceRecordJson.SourceRecordJsonMapper - Class in org.apache.beam.io.debezium
-
SourceRecordJson implementation.
- SourceRecordJsonMapper() - Constructor for class org.apache.beam.io.debezium.SourceRecordJson.SourceRecordJsonMapper
- SourceRecordMapper<T> - Interface in org.apache.beam.io.debezium
-
Interface used to map a Kafka source record.
- sourceSplits() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- SourceTestUtils - Class in org.apache.beam.sdk.testing
-
Helper functions and test harnesses for checking correctness of Source implementations.
- SourceTestUtils() - Constructor for class org.apache.beam.sdk.testing.SourceTestUtils
- SourceTestUtils.ExpectedSplitOutcome - Enum Class in org.apache.beam.sdk.testing
-
Expected outcome of BoundedSource.BoundedReader.splitAtFraction(double).
- sourceType() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
- SOURCEV2 - Enum constant in enum class org.apache.beam.sdk.metrics.Lineage.Type
- span(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns the minimal window that includes both this window and the given window.
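A tiny worked example (epoch-millisecond Instants as placeholders):

    IntervalWindow a = new IntervalWindow(new Instant(0), new Instant(10));
    IntervalWindow b = new IntervalWindow(new Instant(5), new Instant(20));
    IntervalWindow merged = a.span(b); // covers [0, 20)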
- SpannerAccessor - Class in org.apache.beam.sdk.io.gcp.spanner
-
Manages lifecycle of DatabaseClient and Spanner instances.
- SpannerChangestreamsReadConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
- SpannerChangestreamsReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
- SpannerChangestreamsReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
- SpannerChangestreamsReadSchemaTransformProvider.DataChangeRecordToRow - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
- SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
- SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
- SpannerConfig - Class in org.apache.beam.sdk.io.gcp.spanner
-
Configuration for a Cloud Spanner client.
- SpannerConfig() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- SpannerConfig.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
-
Builder for SpannerConfig.
- SpannerIO - Class in org.apache.beam.sdk.io.gcp.spanner
-
Transforms for reading from and writing to Google Cloud Spanner.
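A minimal read sketch (instance, database, and query are placeholders):

    PCollection<Struct> rows = pipeline.apply(
        SpannerIO.read()
            .withInstanceId("my-instance")
            .withDatabaseId("my-database")
            .withQuery("SELECT id, name FROM users"));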
- SpannerIO.CreateTransaction - Class in org.apache.beam.sdk.io.gcp.spanner
-
A PTransform that creates a transaction.
- SpannerIO.CreateTransaction.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
-
A builder for SpannerIO.CreateTransaction.
- SpannerIO.FailureMode - Enum Class in org.apache.beam.sdk.io.gcp.spanner
-
A failure handling strategy.
- SpannerIO.Read - Class in org.apache.beam.sdk.io.gcp.spanner
-
Implementation of SpannerIO.read().
- SpannerIO.ReadAll - Class in org.apache.beam.sdk.io.gcp.spanner
-
Implementation of SpannerIO.readAll().
- SpannerIO.ReadChangeStream - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerIO.SpannerChangeStreamOptions - Interface in org.apache.beam.sdk.io.gcp.spanner
-
Interface to display the name of the metadata table on the Dataflow UI.
- SpannerIO.Write - Class in org.apache.beam.sdk.io.gcp.spanner
-
A PTransform that writes Mutation objects to Google Cloud Spanner.
- SpannerIO.WriteGrouped - Class in org.apache.beam.sdk.io.gcp.spanner
-
Same as SpannerIO.Write but supports grouped mutations.
- SpannerReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- SpannerReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.spanner
-
A provider for reading from Cloud Spanner using a Schema Transform Provider.
- SpannerReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
- SpannerReadSchemaTransformProvider.ErrorFn - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerSchema - Class in org.apache.beam.sdk.io.gcp.spanner
-
Encapsulates Cloud Spanner Schema.
- SpannerSchema() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
- SpannerSchema.Column - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerSchema.KeyPart - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerSchemaRetrievalException - Exception Class in org.apache.beam.sdk.io.gcp.spanner
-
Exception to signal that Spanner schema retrieval failed.
- SpannerTransformRegistrar - Class in org.apache.beam.sdk.io.gcp.spanner
-
Exposes SpannerIO.WriteRows and SpannerIO.ReadRows as external transforms for cross-language usage.
- SpannerTransformRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- SpannerTransformRegistrar.CrossLanguageConfiguration - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerTransformRegistrar.DeleteBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerTransformRegistrar.InsertBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerTransformRegistrar.InsertOrUpdateBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerTransformRegistrar.ReadBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerTransformRegistrar.ReadBuilder.Configuration - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerTransformRegistrar.ReplaceBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerTransformRegistrar.UpdateBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerWriteResult - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- SpannerWriteResult - Class in org.apache.beam.sdk.io.gcp.spanner
-
The results of a SpannerIO.write() transform.
- SpannerWriteResult(Pipeline, PCollection<Void>, PCollection<MutationGroup>, TupleTag<MutationGroup>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- SpannerWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- SpannerWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
- SparkAssignWindowFn<T, W> - Class in org.apache.beam.runners.spark.translation
-
An implementation of Window.Assign for the Spark runner.
- SparkAssignWindowFn(WindowFn<? super T, W>) - Constructor for class org.apache.beam.runners.spark.translation.SparkAssignWindowFn
- SparkBatchPortablePipelineTranslator - Class in org.apache.beam.runners.spark.translation
-
Translates a bounded portable pipeline into a Spark job.
- SparkBatchPortablePipelineTranslator() - Constructor for class org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator
- SparkBatchPortablePipelineTranslator.IsSparkNativeTransform - Class in org.apache.beam.runners.spark.translation
-
Predicate to determine whether a URN is a Spark native transform.
- SparkBeamMetricSource - Class in org.apache.beam.runners.spark.metrics
-
A Spark Source that is tailored to expose a SparkBeamMetric, wrapping an underlying MetricResults instance.
- SparkBeamMetricSource - Class in org.apache.beam.runners.spark.structuredstreaming.metrics
-
A Spark Source that is tailored to expose a SparkBeamMetric, wrapping an underlying MetricResults instance.
- SparkBeamMetricSource(String) - Constructor for class org.apache.beam.runners.spark.metrics.SparkBeamMetricSource
- SparkBeamMetricSource(String, MetricsAccumulator) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.SparkBeamMetricSource
- SparkCombineFn<InputT, ValueT, AccumT, OutputT> - Class in org.apache.beam.runners.spark.translation
-
A CombineFnBase.GlobalCombineFn with a CombineWithContext.Context for the SparkRunner.
- SparkCombineFn.WindowedAccumulator<InputT, ValueT, AccumT, ImplT> - Interface in org.apache.beam.runners.spark.translation
-
Accumulator of WindowedValues holding values for different windows.
- SparkCombineFn.WindowedAccumulator.Type - Enum Class in org.apache.beam.runners.spark.translation
-
Type of the accumulator.
- SparkCommonPipelineOptions - Interface in org.apache.beam.runners.spark
-
Spark runner PipelineOptions handles Spark execution-related configurations, such as the master address, and other user-related knobs.
- SparkCommonPipelineOptions.StorageLevelFactory - Class in org.apache.beam.runners.spark
-
Returns Spark's default storage level for the Dataset or RDD API based on the respective runner.
- SparkCommonPipelineOptions.TmpCheckpointDirFactory - Class in org.apache.beam.runners.spark
-
Returns the default checkpoint directory of /tmp/${job.name}.
- SparkContextFactory - Class in org.apache.beam.runners.spark.translation
- SparkContextOptions - Interface in org.apache.beam.runners.spark
-
A custom PipelineOptions to work with properties related to JavaSparkContext.
- SparkContextOptions.EmptyListenersList - Class in org.apache.beam.runners.spark
-
Returns an empty list, to avoid handling null.
- SparkExecutableStageContextFactory - Class in org.apache.beam.runners.spark.translation
-
Singleton class that contains one ExecutableStageContext.Factory per job.
- SparkGroupAlsoByWindowViaWindowSet - Class in org.apache.beam.runners.spark.stateful
-
An implementation of GroupByKeyViaGroupByKeyOnly.GroupAlsoByWindow logic for grouping by windows and controlling trigger firings and pane accumulation.
- SparkGroupAlsoByWindowViaWindowSet() - Constructor for class org.apache.beam.runners.spark.stateful.SparkGroupAlsoByWindowViaWindowSet
- SparkInputDataProcessor<FnInputT, FnOutputT, OutputT> - Interface in org.apache.beam.runners.spark.translation
-
Processes Spark's input data iterators using Beam's DoFnRunner.
- SparkJobInvoker - Class in org.apache.beam.runners.spark
-
Creates a job invocation to manage the Spark runner's execution of a portable pipeline.
- SparkJobServerDriver - Class in org.apache.beam.runners.spark
-
Driver program that starts a job server for the Spark runner.
- SparkJobServerDriver.SparkServerConfiguration - Class in org.apache.beam.runners.spark
-
Spark runner-specific configuration for the job server.
- SparkKryoRegistrator() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.SparkSessionFactory.SparkKryoRegistrator
- SparkNativePipelineVisitor - Class in org.apache.beam.runners.spark
-
Pipeline visitor for translating a Beam pipeline into equivalent Spark operations.
- SparkPCollectionView - Class in org.apache.beam.runners.spark.translation
-
SparkPCollectionView is used to pass serialized views to lambdas.
- SparkPCollectionView() - Constructor for class org.apache.beam.runners.spark.translation.SparkPCollectionView
- SparkPCollectionView.Type - Enum Class in org.apache.beam.runners.spark.translation
-
Type of side input.
- SparkPipelineOptions - Interface in org.apache.beam.runners.spark
-
Spark runner PipelineOptions handles Spark execution-related configurations, such as the master address, batch-interval, and other user-related knobs.
- SparkPipelineResult - Class in org.apache.beam.runners.spark
-
Represents a Spark pipeline execution result.
- SparkPipelineRunner - Class in org.apache.beam.runners.spark
-
Runs a portable pipeline on Apache Spark.
- SparkPipelineRunner(SparkPipelineOptions) - Constructor for class org.apache.beam.runners.spark.SparkPipelineRunner
- SparkPipelineTranslator - Interface in org.apache.beam.runners.spark.translation
-
Translator to support translation between Beam transformations and Spark transformations.
- SparkPortablePipelineTranslator<T> - Interface in org.apache.beam.runners.spark.translation
-
Interface for portable Spark translators.
- SparkPortableStreamingPipelineOptions - Interface in org.apache.beam.runners.spark
-
Pipeline options specific to the Spark portable runner running a streaming job.
- SparkProcessContext<K, InputT, OutputT> - Class in org.apache.beam.runners.spark.translation
-
Holds the current processing context for SparkInputDataProcessor.
- SparkProcessContext(String, DoFn<InputT, OutputT>, DoFnRunner<InputT, OutputT>, K, Iterator<TimerInternals.TimerData>) - Constructor for class org.apache.beam.runners.spark.translation.SparkProcessContext
- SparkReceiverIO - Class in org.apache.beam.sdk.io.sparkreceiver
-
Streaming sources for Spark Receiver.
- SparkReceiverIO() - Constructor for class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO
- SparkReceiverIO.Read<V> - Class in org.apache.beam.sdk.io.sparkreceiver
-
A PTransform to read from Spark Receiver.
- SparkRunner - Class in org.apache.beam.runners.spark
-
The SparkRunner translates operations defined on a pipeline to a representation executable by Spark, and then submits the job to Spark for execution.
- SparkRunner.Evaluator - Class in org.apache.beam.runners.spark
-
Evaluator on the pipeline.
- SparkRunnerDebugger - Class in org.apache.beam.runners.spark
-
Pipeline runner which translates a Beam pipeline into equivalent Spark operations, without running them.
- SparkRunnerDebugger.DebugSparkPipelineResult - Class in org.apache.beam.runners.spark
-
PipelineResult of running a Pipeline using SparkRunnerDebugger. Use SparkRunnerDebugger.DebugSparkPipelineResult.getDebugString() to get a String representation of the Pipeline translated into Spark native operations.
- SparkRunnerKryoRegistrator - Class in org.apache.beam.runners.spark.coders
-
Custom KryoRegistrators for Beam's Spark runner needs, registering the classes used in Spark translation for better serialization performance.
- SparkRunnerKryoRegistrator() - Constructor for class org.apache.beam.runners.spark.coders.SparkRunnerKryoRegistrator
- SparkRunnerRegistrar - Class in org.apache.beam.runners.spark
- SparkRunnerRegistrar.Options - Class in org.apache.beam.runners.spark
-
Registers the SparkPipelineOptions.
- SparkRunnerRegistrar.Runner - Class in org.apache.beam.runners.spark
-
Registers the SparkRunner.
- SparkRunnerStreamingContextFactory - Class in org.apache.beam.runners.spark.translation.streaming
-
A JavaStreamingContext factory for resilience.
- SparkRunnerStreamingContextFactory(Pipeline, SparkPipelineOptions, Checkpoint.CheckpointDir) - Constructor for class org.apache.beam.runners.spark.translation.streaming.SparkRunnerStreamingContextFactory
- SparkServerConfiguration() - Constructor for class org.apache.beam.runners.spark.SparkJobServerDriver.SparkServerConfiguration
- SparkSessionFactory - Class in org.apache.beam.runners.spark.structuredstreaming.translation
- SparkSessionFactory() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.SparkSessionFactory
- SparkSessionFactory.SparkKryoRegistrator - Class in org.apache.beam.runners.spark.structuredstreaming.translation
-
KryoRegistrator for Spark to serialize broadcast variables used for side-inputs.
- SparkSideInputReader - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
-
SideInputReader using broadcasted SideInputValues.
- SparkSideInputReader - Class in org.apache.beam.runners.spark.util
-
A SideInputReader for the SparkRunner.
- SparkSideInputReader(Map<TupleTag<?>, KV<WindowingStrategy<?, ?>, SideInputBroadcast<?>>>) - Constructor for class org.apache.beam.runners.spark.util.SparkSideInputReader
- SparkStateInternals<K> - Class in org.apache.beam.runners.spark.stateful
-
An implementation of StateInternals for the SparkRunner.
- SparkStreamingPortablePipelineTranslator - Class in org.apache.beam.runners.spark.translation
-
Translates an unbounded portable pipeline into a Spark job.
- SparkStreamingPortablePipelineTranslator() - Constructor for class org.apache.beam.runners.spark.translation.SparkStreamingPortablePipelineTranslator
- SparkStreamingTranslationContext - Class in org.apache.beam.runners.spark.translation
-
Translation context used to lazily store Spark datasets during streaming portable pipeline translation and compute them after translation.
- SparkStreamingTranslationContext(JavaSparkContext, SparkPipelineOptions, JobInfo) - Constructor for class org.apache.beam.runners.spark.translation.SparkStreamingTranslationContext
- SparkStructuredStreamingPipelineOptions - Interface in org.apache.beam.runners.spark.structuredstreaming
-
Spark runner PipelineOptions handles Spark execution-related configurations, such as the master address, and other user-related knobs.
- SparkStructuredStreamingPipelineResult - Class in org.apache.beam.runners.spark.structuredstreaming
- SparkStructuredStreamingRunner - Class in org.apache.beam.runners.spark.structuredstreaming
-
A Spark runner built on top of Spark's SQL Engine (Structured Streaming framework).
- SparkStructuredStreamingRunnerRegistrar - Class in org.apache.beam.runners.spark.structuredstreaming
-
Contains the PipelineRunnerRegistrar and PipelineOptionsRegistrar for the SparkStructuredStreamingRunner.
- SparkStructuredStreamingRunnerRegistrar.Options - Class in org.apache.beam.runners.spark.structuredstreaming
-
Registers the SparkStructuredStreamingPipelineOptions.
- SparkStructuredStreamingRunnerRegistrar.Runner - Class in org.apache.beam.runners.spark.structuredstreaming
-
Registers the SparkStructuredStreamingRunner.
- SparkTimerInternals - Class in org.apache.beam.runners.spark.stateful
-
An implementation of TimerInternals for the SparkRunner.
- SparkTransformOverrides - Class in org.apache.beam.runners.spark
-
PTransform overrides for Spark runner.
- SparkTransformOverrides() - Constructor for class org.apache.beam.runners.spark.SparkTransformOverrides
- SparkTransformsRegistrar() - Constructor for class org.apache.beam.runners.spark.translation.streaming.StreamingTransformTranslator.SparkTransformsRegistrar
- SparkTranslationContext - Class in org.apache.beam.runners.spark.translation
-
Translation context used to lazily store Spark data sets during portable pipeline translation and compute them after translation.
- SparkTranslationContext(JavaSparkContext, PipelineOptions, JobInfo) - Constructor for class org.apache.beam.runners.spark.translation.SparkTranslationContext
- SparkUnboundedSource - Class in org.apache.beam.runners.spark.io
-
A "composite" InputDStream implementation for
UnboundedSource
s. - SparkUnboundedSource() - Constructor for class org.apache.beam.runners.spark.io.SparkUnboundedSource
- SparkUnboundedSource.Metadata - Class in org.apache.beam.runners.spark.io
-
A metadata holder for an input stream partition.
- SparkWatermarks(Instant, Instant, Instant) - Constructor for class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
- specific(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an AvroCoder instance for the provided element type respecting Avro's Specific* suite for encoding and decoding.
- specific(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
-
Returns an AvroDatumFactory instance for the provided element type respecting Avro's Specific* suite for encoding and decoding.
- specific(Class<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an AvroCoder instance for the provided element type respecting Avro's Specific* suite for encoding and decoding.
- specific(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Returns an AvroCoder instance for the provided element type respecting Avro's Specific* suite for encoding and decoding.
- SpecificDatumFactory(Class<T>) - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.SpecificDatumFactory
- SPEED_OPTIMIZED - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.FlexResourceSchedulingGoal
-
Optimize for lower execution time.
- split(double) - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
-
Ask the remote bundle to split its current processing based upon its knowledge of remaining work.
- split(double) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
- split(int) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns a list of up to numSplits + 1 ByteKeys in ascending order, where the keys have been interpolated to form roughly equal sub-ranges of this ByteKeyRange, assuming a uniform distribution of keys within this range.
- split(int, PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
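A small sketch of split(int) from the ByteKeyRange entry above:

    ByteKeyRange range = ByteKeyRange.ALL_KEYS;
    List<ByteKey> splitKeys = range.split(4); // up to 5 keys bounding roughly equal sub-ranges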
- split(int, PipelineOptions) - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- split(int, PipelineOptions) - Method in class org.apache.beam.sdk.io.UnboundedSource
-
Returns a list of UnboundedSource objects representing the instances of this source that should be used when executing the workflow.
- split(long, long) - Method in class org.apache.beam.sdk.io.range.OffsetRange
- split(long, PipelineOptions) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.impulse.BeamImpulseSource
- split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.BoundedCosmosBDSource
- split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
-
Splits the source into bundles of approximately desiredBundleSizeBytes.
- split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
- split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
- split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
- split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
- split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
- split(String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a Regex.Split PTransform that splits a string on the regular expression and then outputs each item.
- split(String, boolean) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.Split
PTransform
that splits a string on the regular expression and then outputs each item. - split(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.Split
PTransform
that splits a string on the regular expression and then outputs each item. - split(Pattern, boolean) - Static method in class org.apache.beam.sdk.transforms.Regex
-
Returns a
Regex.Split
PTransform
that splits a string on the regular expression and then outputs each item. - split(BeamFnApi.ProcessBundleSplitResponse) - Method in interface org.apache.beam.runners.fnexecution.control.BundleSplitHandler
- Split(Pattern, boolean) - Constructor for class org.apache.beam.sdk.transforms.Regex.Split
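For example, applying the Regex.Split transform above (a minimal sketch assuming an existing PCollection<String> named lines):
    // Emits one output element per token, splitting on runs of whitespace.
    PCollection<String> tokens = lines.apply(Regex.split("\\s+"));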
- SPLIT_POINTS_UNKNOWN - Static variable in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
A constant to use as the return value for
BoundedSource.BoundedReader.getSplitPointsConsumed()
orBoundedSource.BoundedReader.getSplitPointsRemaining()
when the exact value is unknown. - splitAtFraction(double) - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
Tells the reader to narrow the range of the input it's going to read and give up the remainder, so that the new range would contain approximately the given fraction of the amount of data in the current range.
- splitAtFraction(double) - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
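A rough sketch of how a runner might use splitAtFraction for dynamic work rebalancing (the 0.5 fraction is arbitrary; createReader and start can throw IOException):
    BoundedSource.BoundedReader<String> reader = source.createReader(options);
    boolean hasRecords = reader.start();
    // ... read part of the input, then try to give back the remainder ...
    BoundedSource<String> residual = reader.splitAtFraction(0.5); // null if the split is rejected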
- splitId - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase.ReaderAndOutput
- splitId() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplit
- splitIndex() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplit
- SplitIntoRangesFn(long) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform.SplitIntoRangesFn
- Split points - Search tag in interface org.apache.beam.sdk.io.range.RangeTracker
- Section
- splitReadStream(SplitReadStreamRequest) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
- splitReadStream(SplitReadStreamRequest, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
- SplitResult<RestrictionT> - Class in org.apache.beam.sdk.transforms.splittabledofn
-
A representation of a split result.
- SplitResult() - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.SplitResult
- Splittable DoFn's - Search tag in annotation interface org.apache.beam.sdk.transforms.DoFn.ProcessElement
- Section
- SplittableDoFnOperator<InputT,
OutputT, - Class in org.apache.beam.runners.flink.translation.wrappers.streamingRestrictionT> -
Flink operator for executing splittable
DoFns
. - SplittableDoFnOperator(DoFn<KeyedWorkItem<byte[], KV<InputT, RestrictionT>>, OutputT>, String, Coder<WindowedValue<KeyedWorkItem<byte[], KV<InputT, RestrictionT>>>>, Map<TupleTag<?>, Coder<?>>, TupleTag<OutputT>, List<TupleTag<?>>, DoFnOperator.OutputManagerFactory<OutputT>, WindowingStrategy<?, ?>, Map<Integer, PCollectionView<?>>, Collection<PCollectionView<?>>, PipelineOptions, Coder<?>, KeySelector<WindowedValue<KeyedWorkItem<byte[], KV<InputT, RestrictionT>>>, ?>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.SplittableDoFnOperator
- SplunkEvent - Class in org.apache.beam.sdk.io.splunk
-
A
SplunkEvent
describes a single payload sent to Splunk's HTTP Event Collector (HEC) endpoint. - SplunkEvent() - Constructor for class org.apache.beam.sdk.io.splunk.SplunkEvent
- SplunkEvent.Builder - Class in org.apache.beam.sdk.io.splunk
-
A builder class for creating a
SplunkEvent
. - SplunkEventCoder - Class in org.apache.beam.sdk.io.splunk
-
A
Coder
forSplunkEvent
objects. - SplunkEventCoder() - Constructor for class org.apache.beam.sdk.io.splunk.SplunkEventCoder
- SplunkIO - Class in org.apache.beam.sdk.io.splunk
-
An unbounded sink for Splunk's HTTP Event Collector (HEC).
- SplunkIO.Write - Class in org.apache.beam.sdk.io.splunk
-
Class
SplunkIO.Write
provides aPTransform
that allows writingSplunkEvent
records into a Splunk HTTP Event Collector endpoint using HTTP POST requests. - SplunkWriteError - Class in org.apache.beam.sdk.io.splunk
-
A class for capturing errors that occur while writing
SplunkEvent
to Splunk's HTTP Event Collector (HEC) endpoint. - SplunkWriteError() - Constructor for class org.apache.beam.sdk.io.splunk.SplunkWriteError
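A minimal SplunkIO.Write sketch, assuming an existing PCollection<SplunkEvent> named events and placeholder HEC URL/token values:
    events.apply("WriteToSplunk",
        SplunkIO.write("https://hec.example.com:8088", "my-hec-token"));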
- SplunkWriteError.Builder - Class in org.apache.beam.sdk.io.splunk
-
A builder class for creating a
SplunkWriteError
. - sql() - Method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeDate
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTime
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampLTZ
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampNTZ
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampTZ
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.geospatial.SnowflakeGeography
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.logical.SnowflakeBoolean
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeFloat
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
- sql() - Method in interface org.apache.beam.sdk.io.snowflake.data.SnowflakeDataType
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeArray
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeObject
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeVariant
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
- sql() - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
- SqlAnalyzer - Class in org.apache.beam.sdk.extensions.sql.zetasql
-
Adapter for
Analyzer
to simplify the API for parsing the query and resolving the AST. - SqlCheckConstraint - Class in org.apache.beam.sdk.extensions.sql.impl.parser
-
Parse tree for
UNIQUE
,PRIMARY KEY
constraints. - SqlColumnDeclaration - Class in org.apache.beam.sdk.extensions.sql.impl.parser
-
Parse tree for column.
- SqlConversionException - Exception Class in org.apache.beam.sdk.extensions.sql.impl
-
Exception thrown when Beam SQL cannot convert SQL to a BeamRelNode.
- SqlConversionException(Throwable) - Constructor for exception class org.apache.beam.sdk.extensions.sql.impl.SqlConversionException
- SqlConversionException(String) - Constructor for exception class org.apache.beam.sdk.extensions.sql.impl.SqlConversionException
- SqlConversionException(String, Throwable) - Constructor for exception class org.apache.beam.sdk.extensions.sql.impl.SqlConversionException
- SqlCreateCatalog - Class in org.apache.beam.sdk.extensions.sql.impl.parser
- SqlCreateCatalog(SqlParserPos, boolean, boolean, SqlNode, SqlNode, SqlNodeList) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateCatalog
- SqlCreateDatabase - Class in org.apache.beam.sdk.extensions.sql.impl.parser
- SqlCreateDatabase(SqlParserPos, boolean, boolean, SqlNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateDatabase
- SqlCreateExternalTable - Class in org.apache.beam.sdk.extensions.sql.impl.parser
-
Parse tree for
CREATE EXTERNAL TABLE
statement. - SqlCreateExternalTable(SqlParserPos, boolean, boolean, SqlIdentifier, List<Schema.Field>, SqlNode, SqlNodeList, SqlNode, SqlNode, SqlNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateExternalTable
-
Creates a SqlCreateExternalTable.
- SqlCreateFunction - Class in org.apache.beam.sdk.extensions.sql.impl.parser
-
Parse tree for
CREATE FUNCTION
statement. - SqlCreateFunction(SqlParserPos, boolean, SqlIdentifier, SqlNode, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
-
Creates a SqlCreateFunction.
- SqlDdlNodes - Class in org.apache.beam.sdk.extensions.sql.impl.parser
-
Utilities concerning
SqlNode
for DDL. - SqlDropCatalog - Class in org.apache.beam.sdk.extensions.sql.impl.parser
- SqlDropCatalog(SqlParserPos, boolean, SqlNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropCatalog
- SqlDropDatabase - Class in org.apache.beam.sdk.extensions.sql.impl.parser
- SqlDropDatabase(SqlParserPos, boolean, SqlNode, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropDatabase
- SqlDropTable - Class in org.apache.beam.sdk.extensions.sql.impl.parser
-
Parse tree for
DROP TABLE
statement. - SqlOperators - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
-
A separate SqlOperators table for functions that do not exist in Calcite or are not compatible with it.
- SqlOperators() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- sqlScalarFunctions() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions
- SQLSERVER - Enum constant in enum class org.apache.beam.io.debezium.Connectors
- SqlSetOptionBeam - Class in org.apache.beam.sdk.extensions.sql.impl.parser
-
SQL parse tree node to represent
SET
andRESET
statements. - SqlSetOptionBeam(SqlParserPos, String, SqlIdentifier, SqlNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.parser.SqlSetOptionBeam
- sqlTableValuedFunctions() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions
-
A SQL-native user-defined table-valued function can be resolved by the Analyzer.
- SqlTransform - Class in org.apache.beam.sdk.extensions.sql
-
SqlTransform
is the DSL interface of Beam SQL. - SqlTransform() - Constructor for class org.apache.beam.sdk.extensions.sql.SqlTransform
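For example (a minimal sketch; PCOLLECTION names the single input, and the field names are assumed):
    PCollection<Row> filtered = rows.apply(
        SqlTransform.query("SELECT f_string FROM PCOLLECTION WHERE f_int > 5"));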
- SqlTransformSchemaTransformProvider - Class in org.apache.beam.sdk.extensions.sql.expansion
- SqlTransformSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
- SqlTypes - Class in org.apache.beam.sdk.schemas.logicaltypes
-
Beam
Schema.LogicalType
s corresponding to SQL data types. - sqlTypeWithAutoCast(RelDataTypeFactory, Type) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
-
SQL-Java type mapping, with specified Beam rules:
1. - SqlUseCatalog - Class in org.apache.beam.sdk.extensions.sql.impl.parser
- SqlUseCatalog(SqlParserPos, String, SqlNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.parser.SqlUseCatalog
- SqlUseDatabase - Class in org.apache.beam.sdk.extensions.sql.impl.parser
- SqlUseDatabase(SqlParserPos, String, SqlNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.parser.SqlUseDatabase
- SqsIO - Class in org.apache.beam.sdk.io.aws2.sqs
-
IO to read (unbounded) from and write to SQS queues.
- SqsIO.Read - Class in org.apache.beam.sdk.io.aws2.sqs
-
A
PTransform
to read/receive messages from SQS. - SqsIO.Write - Class in org.apache.beam.sdk.io.aws2.sqs
-
Deprecated.superseded by
SqsIO.WriteBatches
- SqsIO.WriteBatches<T> - Class in org.apache.beam.sdk.io.aws2.sqs
-
A
PTransform
to send messages to SQS. - SqsIO.WriteBatches.DynamicDestination<T> - Interface in org.apache.beam.sdk.io.aws2.sqs
- SqsIO.WriteBatches.EntryMapperFn<T> - Interface in org.apache.beam.sdk.io.aws2.sqs
-
Mapper to create a
SendMessageBatchRequestEntry
from a unique batch entry id and the inputT
. - SqsIO.WriteBatches.EntryMapperFn.Builder<T> - Interface in org.apache.beam.sdk.io.aws2.sqs
-
A more convenient
SqsIO.WriteBatches.EntryMapperFn
variant that already sets the entry id. - SqsIO.WriteBatches.Result - Class in org.apache.beam.sdk.io.aws2.sqs
-
Result of
SqsIO.writeBatches()
. - SqsMessage - Class in org.apache.beam.sdk.io.aws2.sqs
- SqsMessage() - Constructor for class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
- SqsMessageToBeamRow() - Constructor for class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadSchemaTransformProvider.SqsMessageToBeamRow
- SqsReadConfiguration - Class in org.apache.beam.sdk.io.aws2.sqs.providers
-
Configuration class for reading data from an AWS SQS queue.
- SqsReadConfiguration() - Constructor for class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadConfiguration
- SqsReadConfiguration.Builder - Class in org.apache.beam.sdk.io.aws2.sqs.providers
- SqsReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.aws2.sqs.providers
-
An implementation of
TypedSchemaTransformProvider
for jobs reading data from AWS SQS queues and configured viaSqsReadConfiguration
. - SqsReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.aws2.sqs.providers.SqsReadSchemaTransformProvider
- SqsReadSchemaTransformProvider.SqsMessageToBeamRow - Class in org.apache.beam.sdk.io.aws2.sqs.providers
- src - Variable in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayCopyState
- src - Variable in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayNewState
- SSECustomerKey - Class in org.apache.beam.sdk.io.aws2.s3
-
Customer-provided key for use with Amazon S3 server-side encryption.
- SSECustomerKey.Builder - Class in org.apache.beam.sdk.io.aws2.s3
- SSECustomerKeyFactory() - Constructor for class org.apache.beam.sdk.io.aws2.options.S3Options.SSECustomerKeyFactory
- stageArtifacts(RunnerApi.Pipeline) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
- StageBundleFactory - Interface in org.apache.beam.runners.fnexecution.control
-
A bundle factory scoped to a particular
ExecutableStage
, which has all of the resources it needs to provide newRemoteBundles
. - StagedFile() - Constructor for class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
- stageFiles(List<PackageUtil.StagedFile>) - Method in class org.apache.beam.runners.dataflow.util.GcsStager
-
Stages files to
DataflowPipelineOptions.getStagingLocation()
, suffixed with their md5 hash to avoid collisions. - stageFiles(List<PackageUtil.StagedFile>) - Method in interface org.apache.beam.runners.dataflow.util.Stager
-
Stage files and return a list of
DataflowPackage
objects describing the actual location at which each file was staged. - stagePackage(PackageUtil.PackageAttributes, Sleeper, CreateOptions) - Method in class org.apache.beam.runners.dataflow.util.PackageUtil
-
Stages one file ("package") if necessary.
- Stager - Interface in org.apache.beam.runners.dataflow.util
-
Interface for staging files needed for running a Dataflow pipeline.
- StagerFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.StagerFactory
- stageToFile(byte[], String) - Method in class org.apache.beam.runners.dataflow.util.GcsStager
- stageToFile(byte[], String) - Method in interface org.apache.beam.runners.dataflow.util.Stager
-
Stage bytes to a target file name wherever this stager stages things.
- stageToFile(byte[], String, String, CreateOptions) - Method in class org.apache.beam.runners.dataflow.util.PackageUtil
- STAGING_TO_ARTIFACT_URN - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
- StagingLocationFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.StagingLocationFactory
- StandardCreateOptions() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions
- start() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
- start() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- start() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplitEnumerator
- start() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.LazyFlinkSourceSplitEnumerator
- start() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.unbounded.FlinkUnboundedSourceReader
- start() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Start the job.
- start() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
- start() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
- start() - Method in class org.apache.beam.runners.spark.metrics.sink.CsvSink
- start() - Method in class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
- start() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleCsvSink
- start() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleGraphiteSink
- start() - Method in class org.apache.beam.runners.spark.translation.streaming.TestDStream
- start() - Method in class org.apache.beam.sdk.extensions.python.PythonService
- start() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
-
Starts the flushing daemon thread if data_buffer_time_limit_ms is set.
- start() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- start() - Method in interface org.apache.beam.sdk.io.jms.AutoScaler
-
The
AutoScaler
is started when theJmsIO.UnboundedJmsReader
is started. - start() - Method in class org.apache.beam.sdk.io.jms.DefaultAutoscaler
- start() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
- start() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageReceiver
-
Starts the message receiver.
- start() - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageReceiver
- start() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Initializes the reader and advances the reader to the first record.
- start() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Initializes the reader and advances the reader to the first record.
- start() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns the start of this window, inclusive.
- start() - Method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
- START_WITHS - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- startAt(Instant) - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
-
Assign a timestamp when the pipeline starts to produce data.
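A small sketch combining startAt, stopAt, and withInterval (joda-time Instant/Duration; the values are illustrative):
    PCollection<Instant> ticks = pipeline.apply(
        PeriodicImpulse.create()
            .startAt(Instant.now())
            .stopAt(Instant.now().plus(Duration.standardMinutes(10)))
            .withInterval(Duration.standardSeconds(30)));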
- startBundle() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
- startBundle() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.stableinput.BufferingDoFnRunner
- startBundle() - Method in class org.apache.beam.runners.spark.translation.DoFnRunnerWithMetrics
- startBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
- startBundle() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- startBundle() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated.Use
TestPipeline
with theDirectRunner
. - startBundle(DoFn.StartBundleContext) - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
- startBundle(DoFn.StartBundleContext) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
- startBundle(DoFn.StartBundleContext, PipelineOptions) - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlSeekableTable
- StartBundleContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.StartBundleContext
- startCopyJob(JobReference, JobConfigurationTableCopy) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Start a BigQuery copy job.
- startCopyJob(JobReference, JobConfigurationTableCopy) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- startExtractJob(JobReference, JobConfigurationExtract) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Start a BigQuery extract job.
- startExtractJob(JobReference, JobConfigurationExtract) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- startImpl() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
- startImpl() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
Initializes the
OffsetBasedSource.OffsetBasedReader
and advances to the first record, returningtrue
if there is a record available to be read. - startLoadJob(JobReference, JobConfigurationLoad) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Start a BigQuery load job.
- startLoadJob(JobReference, JobConfigurationLoad) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- startLoadJob(JobReference, JobConfigurationLoad, AbstractInputStreamContent) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Start a BigQuery load job with stream content.
- startLoadJob(JobReference, JobConfigurationLoad, AbstractInputStreamContent) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- startOrAdvance(ReaderOutput<OutputT>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase.ReaderAndOutput
- startProcess(String, String, List<String>, Map<String, String>) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessManager
-
Forks a process with the given command, arguments, and additional environment variables.
- startProcess(String, String, List<String>, Map<String, String>, File) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessManager
- startQueryJob(JobReference, JobConfigurationQuery) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Start a BigQuery query job.
- startQueryJob(JobReference, JobConfigurationQuery) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- startReading(ReadableByteChannel) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
- startReading(ReadableByteChannel) - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
Creates a decompressing channel from the input channel and passes it to its delegate reader's
FileBasedSource.FileBasedReader.startReading(ReadableByteChannel)
. - startReading(ReadableByteChannel) - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
Performs any initialization of the subclass of
FileBasedReader
that involves IO operations. - startRunnerBundle(DoFnRunner<InputT, OutputT>) - Method in class org.apache.beam.runners.jet.processors.ParDoP
- startRunnerBundle(DoFnRunner<KV<?, ?>, OutputT>) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- STARTS_WITH_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
- startsWith(String) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- startsWith(String) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
A
SerializableMatcher
with identical criteria toMatchers.startsWith(java.lang.String)
. - startsWith(String, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- startsWith(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- startsWith(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- state(StateNamespace, StateTag<T>, StateContext<?>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkBroadcastStateInternals
- state(StateNamespace, StateTag<T>, StateContext<?>) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals
- state(StateNamespace, StateTag<T>, StateContext<?>) - Method in class org.apache.beam.runners.spark.stateful.SparkStateInternals
- state(StreamObserver<BeamFnApi.StateResponse>) - Method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
- State - Interface in org.apache.beam.sdk.state
-
A state cell, supporting a
State.clear()
operation. - STATE_CACHE_SIZE - Static variable in interface org.apache.beam.sdk.options.ExperimentalOptions
- STATE_SAMPLING_PERIOD_MILLIS - Static variable in interface org.apache.beam.sdk.options.ExperimentalOptions
- StateAndTimerBundleCheckpointHandler(TimerInternalsFactory<T>, StateInternalsFactory<T>, Coder<WindowedValue<T>>, Coder) - Constructor for class org.apache.beam.runners.fnexecution.control.BundleCheckpointHandlers.StateAndTimerBundleCheckpointHandler
- StateAndTimers - Class in org.apache.beam.runners.spark.stateful
-
State and Timers wrapper.
- StateAndTimers() - Constructor for class org.apache.beam.runners.spark.stateful.StateAndTimers
- StateBinder - Interface in org.apache.beam.sdk.state
-
For internal use only; no backwards-compatibility guarantees.
- StateContext<W> - Interface in org.apache.beam.sdk.state
-
For internal use only; no backwards-compatibility guarantees.
- StateContexts - Class in org.apache.beam.sdk.state
-
For internal use only; no backwards-compatibility guarantees.
- StateContexts() - Constructor for class org.apache.beam.sdk.state.StateContexts
- StateDelegator - Interface in org.apache.beam.runners.fnexecution.state
-
The
StateDelegator
is able to delegateBeamFnApi.StateRequest
s to a set of registered handlers. - StateDelegator.Registration - Interface in org.apache.beam.runners.fnexecution.state
-
Allows callers to deregister from receiving further state requests.
- Statefulness - Search tag in org.apache.beam.sdk.io.BoundedSource.BoundedReader.splitAtFraction(double)
- Section
- StatefulParDoP<OutputT> - Class in org.apache.beam.runners.jet.processors
-
Jet
Processor
implementation for Beam's stateful ParDo primitive. - StatefulParDoP.Supplier<OutputT> - Class in org.apache.beam.runners.jet.processors
-
Jet
Processor
supplier that will provide instances ofStatefulParDoP
. - StatefulStreamingParDoEvaluator<KeyT,
ValueT, - Class in org.apache.beam.runners.spark.translation.streamingOutputT> -
A specialized evaluator for ParDo operations in the Spark Streaming context, invoked when stateful streaming is detected in the DoFn.
- StatefulStreamingParDoEvaluator() - Constructor for class org.apache.beam.runners.spark.translation.streaming.StatefulStreamingParDoEvaluator
- stateInternals() - Method in class org.apache.beam.runners.flink.translation.functions.FlinkNoOpStepContext
- stateInternals() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.FlinkStepContext
- stateInternals() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.NoOpStepContext
- stateInternals() - Method in class org.apache.beam.runners.twister2.utils.NoOpStepContext
- StateKeySpec - Class in org.apache.beam.sdk.state
- StateRequestHandler - Interface in org.apache.beam.runners.fnexecution.state
-
Handler for
StateRequests
. - StateRequestHandlers - Class in org.apache.beam.runners.fnexecution.state
-
A set of utility methods which construct
StateRequestHandler
s. - StateRequestHandlers() - Constructor for class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
- StateRequestHandlers.BagUserStateHandler<K,
V, - Interface in org.apache.beam.runners.fnexecution.stateW> -
A handler for bag user state.
- StateRequestHandlers.BagUserStateHandlerFactory<K,
V, - Interface in org.apache.beam.runners.fnexecution.stateW> -
A factory which constructs
StateRequestHandlers.BagUserStateHandler
s. - StateRequestHandlers.IterableSideInputHandler<V,
W> - Interface in org.apache.beam.runners.fnexecution.state -
A handler for iterable side inputs.
- StateRequestHandlers.MultimapSideInputHandler<K,
V, - Interface in org.apache.beam.runners.fnexecution.stateW> -
A handler for multimap side inputs.
- StateRequestHandlers.SideInputHandler - Interface in org.apache.beam.runners.fnexecution.state
-
Marker interface that denotes some type of side input handler.
- StateRequestHandlers.SideInputHandlerFactory - Interface in org.apache.beam.runners.fnexecution.state
-
A factory which constructs
StateRequestHandlers.MultimapSideInputHandler
s. - StateSpec<StateT> - Interface in org.apache.beam.sdk.state
-
A specification of a persistent state cell.
- StateSpec.Cases<ResultT> - Interface in org.apache.beam.sdk.state
-
Cases for doing a "switch" on the type of
StateSpec
. - StateSpec.Cases.WithDefault<ResultT> - Class in org.apache.beam.sdk.state
-
A base class for a visitor with a default method for cases it is not interested in.
- StateSpecFunctions - Class in org.apache.beam.runners.spark.stateful
-
A class containing
StateSpec
mapping functions. - StateSpecFunctions() - Constructor for class org.apache.beam.runners.spark.stateful.StateSpecFunctions
- StateSpecs - Class in org.apache.beam.sdk.state
-
Static methods for working with
StateSpecs
. - STATIC - Enum constant in enum class org.apache.beam.runners.spark.translation.SparkPCollectionView.Type
-
for fixed inputs.
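To make the StateSpec/StateSpecs relationship above concrete, a minimal stateful DoFn sketch (names are illustrative):
    // Inside a DoFn<KV<String, Long>, Void>:
    @StateId("count")
    private final StateSpec<ValueState<Long>> countSpec = StateSpecs.value(VarLongCoder.of());

    @ProcessElement
    public void process(@StateId("count") ValueState<Long> count) {
      Long current = count.read(); // null on first use for this key and window
      count.write((current == null ? 0L : current) + 1);
    }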
- StaticGrpcProvisionService - Class in org.apache.beam.runners.fnexecution.provisioning
-
A
provision service
that returns a static response to all calls. - StaticRemoteEnvironment - Class in org.apache.beam.runners.fnexecution.environment
-
A
RemoteEnvironment
that connects to a Dataflow runner harness. - StaticRemoteEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
-
An
EnvironmentFactory
that creates a StaticRemoteEnvironment for a runner harness that wants to use an existing InstructionRequestHandler. - StaticRemoteEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
-
Provider for StaticRemoteEnvironmentFactory.
- StaticSchemaInference - Class in org.apache.beam.sdk.schemas.utils
-
A set of utilities for inferring a Beam
Schema
from static Java types. - StaticSchemaInference() - Constructor for class org.apache.beam.sdk.schemas.utils.StaticSchemaInference
- status() - Method in class org.apache.beam.sdk.io.fs.MatchResult
-
Status of the
MatchResult
. - status() - Method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
- STATUS_BACKOFF_FACTORY - Static variable in class org.apache.beam.runners.dataflow.DataflowPipelineJob
- statusCode() - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError
- statusMessage() - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError
- stepName - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- stepName - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- stepName() - Method in class org.apache.beam.sdk.metrics.MetricKey
-
The step name associated with this metric, or null if none is associated.
- steps() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
- stop() - Method in interface org.apache.beam.runners.flink.translation.wrappers.streaming.io.BeamStoppableFunction
-
Unused method for backward compatibility.
- stop() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
- stop() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
- stop() - Method in class org.apache.beam.runners.spark.metrics.sink.CsvSink
- stop() - Method in class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
- stop() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
- stop() - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger.DebugSparkPipelineResult
- stop() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleCsvSink
- stop() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleGraphiteSink
- stop() - Method in class org.apache.beam.runners.spark.translation.streaming.TestDStream
- stop() - Method in interface org.apache.beam.sdk.io.jms.AutoScaler
-
The
AutoScaler
is stopped when theJmsIO.UnboundedJmsReader
is closed. - stop() - Method in class org.apache.beam.sdk.io.jms.DefaultAutoscaler
- stop() - Static method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
-
Indicates that there is no more work to be done for the current element.
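As a sketch of where stop() fits in a splittable DoFn (restriction handling abbreviated; an OffsetRange-based tracker is assumed):
    @ProcessElement
    public ProcessContinuation process(
        RestrictionTracker<OffsetRange, Long> tracker, OutputReceiver<String> out) {
      for (long i = tracker.currentRestriction().getFrom(); tracker.tryClaim(i); i++) {
        out.output("element-" + i); // placeholder per-position output
      }
      return ProcessContinuation.stop(); // nothing left for this element
    }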
- stopAfter(Duration) - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
-
For internal use only; no backwards-compatibility guarantees.
- stopAt(Instant) - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
-
Assign a timestamp when the pipeline stops producing data.
- STOPPED - Enum constant in enum class org.apache.beam.sdk.PipelineResult.State
-
The job has been paused, or has not yet started.
- stopProcess(String) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessManager
-
Stops a previously started process identified by its unique id.
- stopSampling(int, long) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.LimitNumberOfFiles
- stopSampling(int, long) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.LimitNumberOfTotalBytes
- stopSampling(int, long) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.SampleAllFiles
- stopSampling(int, long) - Method in interface org.apache.beam.sdk.io.TextRowCountEstimator.SamplingStrategy
- stopSparkContext(JavaSparkContext) - Static method in class org.apache.beam.runners.spark.translation.SparkContextFactory
- STORAGE_API_AT_LEAST_ONCE - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
Use the new Storage Write API without exactly-once semantics enabled.
- STORAGE_STATS - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
- STORAGE_WRITE_API - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
Use the new, exactly-once Storage Write API.
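A minimal sketch of selecting the exactly-once Storage Write API on a BigQuery write (the table spec is a placeholder; streaming pipelines additionally need a triggering frequency):
    tableRows.apply(BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.my_table")
        .withMethod(BigQueryIO.Write.Method.STORAGE_WRITE_API));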
- StorageApiCDC - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Constants and variables for CDC support.
- StorageApiCDC() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
- StorageApiConvertMessages<DestinationT,
ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery -
A transform that converts messages to protocol buffers in preparation for writing to BigQuery.
- StorageApiConvertMessages(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryServices, TupleTag<BigQueryStorageApiInsertError>, TupleTag<KV<DestinationT, StorageApiWritePayload>>, Coder<BigQueryStorageApiInsertError>, Coder<KV<DestinationT, StorageApiWritePayload>>, SerializableFunction<ElementT, RowMutationInformation>, BadRecordRouter) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages
- StorageApiConvertMessages.ConvertMessagesDoFn<DestinationT,
ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery - StorageApiDynamicDestinationsTableRow<T,
DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery - StorageApiFlushAndFinalizeDoFn - Class in org.apache.beam.sdk.io.gcp.bigquery
-
This DoFn flushes and optionally (if requested) finalizes Storage API streams.
- StorageApiFlushAndFinalizeDoFn(BigQueryServices) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiFlushAndFinalizeDoFn
- StorageApiLoads<DestinationT,
ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery -
This
PTransform
manages loads into BigQuery using the Storage API. - StorageApiLoads(Coder<DestinationT>, StorageApiDynamicDestinations<ElementT, DestinationT>, SerializableFunction<ElementT, RowMutationInformation>, BigQueryIO.Write.CreateDisposition, String, Duration, BigQueryServices, int, boolean, boolean, boolean, boolean, boolean, Predicate<String>, boolean, AppendRowsRequest.MissingValueInterpretation, Map<String, String>, BadRecordRouter, ErrorHandler<BadRecord, ?>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- StorageApiWritePayload - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Class used to wrap elements being sent to the Storage API sinks.
- StorageApiWritePayload() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- StorageApiWritePayload.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
- StorageApiWriteRecordsInconsistent<DestinationT,
ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery -
A transform to write sharded records to BigQuery using the Storage API.
- StorageApiWriteRecordsInconsistent(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryServices, TupleTag<BigQueryStorageApiInsertError>, TupleTag<TableRow>, Predicate<String>, Coder<BigQueryStorageApiInsertError>, Coder<TableRow>, boolean, boolean, BigQueryIO.Write.CreateDisposition, String, boolean, AppendRowsRequest.MissingValueInterpretation, Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteRecordsInconsistent
- StorageApiWritesShardedRecords<DestinationT,
ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery -
A transform to write sharded records to BigQuery using the Storage API (Streaming).
- StorageApiWritesShardedRecords(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryIO.Write.CreateDisposition, String, BigQueryServices, Coder<DestinationT>, Coder<BigQueryStorageApiInsertError>, Coder<TableRow>, TupleTag<BigQueryStorageApiInsertError>, TupleTag<TableRow>, Predicate<String>, boolean, boolean, AppendRowsRequest.MissingValueInterpretation, Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritesShardedRecords
- StorageApiWriteUnshardedRecords<DestinationT,
ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery -
Write records to the Storage API using a standard batch approach.
- StorageApiWriteUnshardedRecords(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryServices, TupleTag<BigQueryStorageApiInsertError>, TupleTag<TableRow>, Predicate<String>, Coder<BigQueryStorageApiInsertError>, Coder<TableRow>, boolean, boolean, BigQueryIO.Write.CreateDisposition, String, boolean, AppendRowsRequest.MissingValueInterpretation, Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords
- storageExists() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
- storageLevel() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
- StorageLevelFactory() - Constructor for class org.apache.beam.runners.spark.SparkCommonPipelineOptions.StorageLevelFactory
- storageObject() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
-
Returns the
StorageObject
. - StorageObjectOrIOException() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
- storeId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- storeRecord(HistoryRecord) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
- STREAM_PARTITION_PREFIX - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
- streaming(Boolean) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- STREAMING - Enum constant in enum class org.apache.beam.runners.spark.translation.SparkPCollectionView.Type
-
for dynamically updated inputs.
- STREAMING - Enum constant in enum class org.apache.beam.sdk.io.solace.SolaceIO.WriterType
- STREAMING_ENGINE_EXPERIMENT - Static variable in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
Experiment that turns on the Streaming Engine.
- STREAMING_INSERTS - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
Use the BigQuery streaming insert API to insert data.
- STREAMING_INSERTS - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.RpcMethod
- Streaming Changes from Cloud Bigtable - Search tag in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
- Section
- StreamingImpulseSource - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io
-
Deprecated.Legacy non-portable source which can be replaced by a DoFn with timers. https://jira.apache.org/jira/browse/BEAM-8353
- StreamingImpulseSource(int, int) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.StreamingImpulseSource
-
Deprecated.
- StreamingInserts<DestinationT,
ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery -
PTransform that performs streaming BigQuery write.
- StreamingInserts(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>, Coder<ElementT>, SerializableFunction<ElementT, TableRow>, SerializableFunction<ElementT, TableRow>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
-
Constructor.
- StreamingInsertsMetrics - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
Stores and exports metrics for a batch of Streaming Inserts RPCs.
- StreamingInsertsMetrics.NoOpStreamingInsertsMetrics - Class in org.apache.beam.sdk.io.gcp.bigquery
-
No-op implementation of
StreamingInsertsMetrics
. - StreamingInsertsMetrics.StreamingInsertsMetricsImpl - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Metrics of a batch of InsertAll RPCs.
- StreamingInsertsMetricsImpl() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.StreamingInsertsMetricsImpl
- StreamingIT - Interface in org.apache.beam.sdk.testing
-
Deprecated.tests which use unbounded PCollections should be in the category
UsesUnboundedPCollections
. Beyond that, it is up to the runner and test configuration to decide whether to run in streaming mode. - StreamingLogLevel - Enum Class in org.apache.beam.sdk.io.snowflake.enums
- Streaming new files matching a filepattern - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroIO
- Section
- StreamingOptions - Interface in org.apache.beam.sdk.options
-
Options used to configure streaming.
- StreamingSideInputHandlerFactory - Class in org.apache.beam.runners.fnexecution.translation
-
StateRequestHandler
that usesSideInputHandler
to access the broadcast state that represents side inputs. - StreamingSourceContextImpl - Class in org.apache.beam.sdk.io.cdap.context
-
Class for creating context objects for different CDAP classes with the stream source type.
- StreamingSourceContextImpl() - Constructor for class org.apache.beam.sdk.io.cdap.context.StreamingSourceContextImpl
- Streaming Support - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- StreamingTransformTranslator - Class in org.apache.beam.runners.spark.translation.streaming
-
Supports translation between a Beam transform and Spark's operations on DStreams.
- StreamingTransformTranslator.SparkTransformsRegistrar - Class in org.apache.beam.runners.spark.translation.streaming
-
Registers classes specialized by the Spark runner.
- StreamingTransformTranslator.Translator - Class in org.apache.beam.runners.spark.translation.streaming
-
Translator matches Beam transformation with the appropriate evaluator.
- StreamingWriteTables<ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
This transform takes in key-value pairs of
TableRow
entries and theTableDestination
it should be written to. - StreamingWriteTables() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
- StreamPartitionWithWatermark - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
- StreamPartitionWithWatermark(Range.ByteStringRange, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
- StreamProgress - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction
-
Position for
ReadChangeStreamPartitionProgressTracker
. - StreamProgress() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- StreamProgress(ChangeStreamContinuationToken, Instant, BigDecimal, Instant, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- StreamProgress(CloseStream) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- streamToTable(SnowflakeServices, ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
- StreamTransformTranslator<TransformT> - Interface in org.apache.beam.runners.twister2.translators
-
Stream TransformTranslator interface.
- Stream writing - Search tag in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
- Section
- STRING - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- STRING - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- STRING - Enum constant in enum class org.apache.beam.sdk.transforms.display.DisplayData.Type
- STRING - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- STRING - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of string fields.
- STRING_TYPES - Static variable in enum class org.apache.beam.sdk.schemas.Schema.TypeName
- StringAgg - Class in org.apache.beam.sdk.extensions.sql.impl.udaf
-
Combine.CombineFn
s for aggregating strings or bytes with an optional delimiter (default comma). - StringAgg() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg
- StringAgg.StringAggByte - Class in org.apache.beam.sdk.extensions.sql.impl.udaf
-
A
Combine.CombineFn
that aggregates bytes with a byte array as delimiter. - StringAgg.StringAggString - Class in org.apache.beam.sdk.extensions.sql.impl.udaf
-
A
Combine.CombineFn
that aggregates strings with a string as delimiter. - StringAggByte(byte[]) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
- StringAggString(String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
- StringBuilderBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBuilderBundle
- StringBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBundle
- StringCompiler - Class in org.apache.beam.sdk.schemas.transforms.providers
- StringCompiler() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler
- StringCompiler.CompileException - Exception Class in org.apache.beam.sdk.schemas.transforms.providers
- StringDelegateCoder<T> - Class in org.apache.beam.sdk.coders
-
A
Coder
that wraps aCoder<String>
and encodes/decodes values via string representations. - StringDelegateCoder(Class<T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.StringDelegateCoder
- StringFunctions - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
-
StringFunctions.
- StringFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- strings() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
The
TypeDescriptor
for String. - stringSet(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that accumulates and reports set of unique string values.
- stringSet(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that accumulates and reports set of unique string values.
- StringSet - Interface in org.apache.beam.sdk.metrics
-
A metric that reports set of unique string values.
- StringSetImpl - Class in org.apache.beam.runners.jet.metrics
-
Implementation of
StringSet
. - StringSetImpl(MetricName) - Constructor for class org.apache.beam.runners.jet.metrics.StringSetImpl
- StringSetResult - Class in org.apache.beam.sdk.metrics
-
The result of a
StringSet
metric. - StringSetResult() - Constructor for class org.apache.beam.sdk.metrics.StringSetResult
- StringSetResult.EmptyStringSetResult - Class in org.apache.beam.sdk.metrics
-
Empty
StringSetResult
, representing no values reported and is immutable. - StringUtf8Coder - Class in org.apache.beam.sdk.coders
- stripGetterPrefix(String) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- StripIdsDoFn() - Constructor for class org.apache.beam.sdk.values.ValueWithRecordId.StripIdsDoFn
- stripPartitionDecorator(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
-
Strip off any partition decorator information from a tablespec.
- stripPrefix(String, String) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- stripSetterPrefix(String) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
- Structs - Class in org.apache.beam.runners.dataflow.util
-
A collection of static methods for manipulating datastructure representations transferred via the Dataflow API.
- StructuralByteArray - Class in org.apache.beam.sdk.coders
-
A wrapper around a byte[] that uses structural, value-based equality rather than byte[]'s normal object identity.
- StructuralByteArray(byte[]) - Constructor for class org.apache.beam.sdk.coders.StructuralByteArray
- StructuralKey<K> - Class in org.apache.beam.runners.local
-
A (Key, Coder) pair that uses the structural value of the key (as provided by
Coder.structuralValue(Object)
) to perform equality and hashing. - structuralValue(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
Returns an object with an
Object.equals()
method that represents structural equality on the argument. - structuralValue(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
- structuralValue(Iterable<T>) - Method in class org.apache.beam.sdk.coders.IterableCoder
- structuralValue(Void) - Method in class org.apache.beam.sdk.coders.VoidCoder
- structuralValue(Deque<T>) - Method in class org.apache.beam.sdk.coders.DequeCoder
- structuralValue(List<T>) - Method in class org.apache.beam.sdk.coders.ListCoder
- structuralValue(Map<K, V>) - Method in class org.apache.beam.sdk.coders.MapCoder
- structuralValue(Optional<T>) - Method in class org.apache.beam.sdk.coders.OptionalCoder
- structuralValue(SortedMap<K, V>) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
- structuralValue(IsmFormat.IsmRecord<V>) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- structuralValue(KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- structuralValue(KV<K, V>) - Method in class org.apache.beam.sdk.coders.KvCoder
- structuralValue(TimestampedValue<T>) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
- structuralValue(ProducerRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- structuralValue(T) - Method in class org.apache.beam.sdk.coders.Coder
-
Returns an object with an
Object.equals()
method that represents structural equality on the argument. - structuralValue(T) - Method in class org.apache.beam.sdk.coders.DelegateCoder
-
Returns an object with an
Object.equals()
method that represents structural equality on the argument. - structuralValue(T) - Method in class org.apache.beam.sdk.coders.SerializableCoder
-
The structural value of the object is the object itself.
- structuralValue(T) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- structuralValue(T) - Method in class org.apache.beam.sdk.coders.ZstdCoder
-
Returns an object with an
Object.equals()
method that represents structural equality on the argument. - structuralValueConsistentWithEquals(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<T>
and values of typeT
, the structural values are equal if and only if the encoded bytes are equal. - structuralValueConsistentWithEqualsInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<T>
,Coder.Context
, and values of typeT
, the structural values are equal if and only if the encoded bytes are equal, in anyCoder.Context
. - structuralValueDecodeEncodeEqual(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<T>
and value of typeT
, the structural value is equal to the structural value yield by encoding and decoding the original value. - structuralValueDecodeEncodeEqualInContext(Coder<T>, Coder.Context, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<T>
,Coder.Context
, and value of typeT
, the structural value is equal to the structural value yield by encoding and decoding the original value, in anyCoder.Context
. - structuralValueDecodeEncodeEqualIterable(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<T>
and value of typeT
, the structural value of the content of the Iterable is equal to the structural value yield by encoding and decoding the original value. - structuralValueDecodeEncodeEqualIterableInContext(Coder<T>, Coder.Context, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given
Coder<T>
,Coder.Context
, and value of typeT
, the structural content of the Iterable of the value is equal to the structural value yield by encoding and decoding the original value, in anyCoder.Context
. - StructuredCoder<T> - Class in org.apache.beam.sdk.coders
-
An abstract base class to implement a
Coder
that defines equality, hashing, and printing via the class name and recursively usingStructuredCoder.getComponents()
. - StructuredCoder() - Constructor for class org.apache.beam.sdk.coders.StructuredCoder
- StsAssumeRoleForFederatedCredentialsProvider - Class in org.apache.beam.sdk.io.aws2.auth
-
An implementation of AwsCredentialsProvider that periodically sends an
AssumeRoleWithWebIdentityRequest
to the AWS Security Token Service to maintain short-lived sessions to use for authentication. - StsAssumeRoleForFederatedCredentialsProvider() - Constructor for class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider
- StsAssumeRoleForFederatedCredentialsProvider.Builder - Class in org.apache.beam.sdk.io.aws2.auth
-
Builder class for
StsAssumeRoleForFederatedCredentialsProvider
. - studyId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- subclassGetterInterface(ByteBuddy, Type, Type) - Static method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils
- subclassSetterInterface(ByteBuddy, Type, Type) - Static method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils
- submitFn - Variable in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
- subpath(int, int) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- subPathMatches(String, String) - Static method in class org.apache.beam.sdk.metrics.MetricFiltering
-
subPathMatches(haystack, needle)
returns true ifneedle
represents a path withinhaystack
. - SubscriberOptions - Class in org.apache.beam.sdk.io.gcp.pubsublite
- SubscriberOptions() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
- SubscriberOptions.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
- SubscribeTransform - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- SubscribeTransform(SubscriberOptions) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscribeTransform
- SubscriptionPartition - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- SubscriptionPartition() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartition
- SubscriptionPartitionCoder - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- SubscriptionPartitionCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
- subscriptionPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Subscription path used to listen for messages on
TestPubsub.topicPath()
. - subscriptionPath() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
- subscriptionPathFromName(String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- subscriptionPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- substr(String, long, long) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- SUBSTR - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- SUBSTR_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
- SUBSTR_PARAMETER_EXCEED_INTEGER - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- subTriggers - Variable in class org.apache.beam.sdk.transforms.windowing.Trigger
- subTriggers() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
- success() - Static method in class org.apache.beam.sdk.testing.SuccessOrFailure
- success(String, String) - Static method in class org.apache.beam.sdk.io.tika.ParseResult
- success(String, String, Metadata) - Static method in class org.apache.beam.sdk.io.tika.ParseResult
- SUCCESS - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
-
The tag for successful writes to the HL7v2 store.
- SUCCESS_COUNTER - Static variable in class org.apache.beam.sdk.testing.PAssert
- SUCCESS_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
- SUCCESS_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
- SUCCESS_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- SUCCESS_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- SUCCESS_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
- SUCCESSFUL_BODY - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
The tag for successful writes to FHIR store.
- SUCCESSFUL_BUNDLES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
-
The TupleTag used for bundles that were executed successfully.
- SUCCESSFUL_PUBLISH_TAG - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO.Write
- SUCCESSFUL_WRITES - Static variable in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- SuccessOrFailure - Class in org.apache.beam.sdk.testing
-
Output of
PAssert
. - Sum - Class in org.apache.beam.sdk.transforms
-
PTransform
s for computing the sum of the elements in aPCollection
, or the sum of the values associated with each key in aPCollection
ofKV
s. - Sum() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
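For instance, the Sum transforms above can be applied like this; a minimal sketch with illustrative values:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.PCollection;

    public class SumExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        // Produces a single-element PCollection containing 10.
        PCollection<Integer> total =
            p.apply(Create.of(1, 2, 3, 4)).apply(Sum.integersGlobally());
        p.run().waitUntilFinish();
      }
    }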
- SUM - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- supplier(SerializablePipelineOptions, WindowedValues.WindowedValueCoder<KV<K, V>>, Coder, WindowingStrategy, String) - Static method in class org.apache.beam.runners.jet.processors.WindowGroupP
- supplier(Coder, String) - Static method in class org.apache.beam.runners.jet.processors.ImpulseP
- supplier(Coder, Coder, WindowingStrategy<?, ?>, String) - Static method in class org.apache.beam.runners.jet.processors.ViewP
- supplier(Coder, Coder, WindowingStrategy<T, BoundedWindow>, String) - Static method in class org.apache.beam.runners.jet.processors.AssignWindowP
- supplier(BoundedSource<T>, SerializablePipelineOptions, Coder, String) - Static method in class org.apache.beam.runners.jet.processors.BoundedSourceP
- supplier(UnboundedSource<T, CmT>, SerializablePipelineOptions, Coder, String) - Static method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
- Supplier(String, String, DoFn<InputT, OutputT>, WindowingStrategy<?, ?>, DoFnSchemaInformation, SerializablePipelineOptions, TupleTag<OutputT>, Set<TupleTag<OutputT>>, Coder<InputT>, Map<PCollectionView<?>, Coder<?>>, Map<TupleTag<?>, Coder<?>>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, Collection<PCollectionView<?>>, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.jet.processors.ParDoP.Supplier
- Supplier(String, String, DoFn<KV<?, ?>, OutputT>, WindowingStrategy<?, ?>, DoFnSchemaInformation, SerializablePipelineOptions, TupleTag<OutputT>, Set<TupleTag<OutputT>>, Coder<KV<?, ?>>, Map<PCollectionView<?>, Coder<?>>, Map<TupleTag<?>, Coder<?>>, Coder<KV<?, ?>>, Map<TupleTag<?>, Coder<?>>, Collection<PCollectionView<?>>, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.jet.processors.StatefulParDoP.Supplier
- Supplier(Map<String, Coder>, Coder, String) - Constructor for class org.apache.beam.runners.jet.processors.FlattenP.Supplier
- SUPPORTED_FORMATS - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- SUPPORTED_FORMATS - Static variable in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- SUPPORTED_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
- SUPPORTED_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- SUPPORTED_OPS - Static variable in class org.apache.beam.sdk.io.iceberg.FilterUtils
- Supported Kafka Client Versions - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- supportsCondition() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
- supportsNormalizedKey() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- supportsPartitioning(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergTableProvider
- supportsPartitioning(Table) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
- supportsPartitioning(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
- supportsProjectionPushdown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- supportsProjectionPushdown() - Method in interface org.apache.beam.sdk.schemas.ProjectionProducer
-
Whether
this
supports projection pushdown. - supportsProjects() - Method in class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
- supportsProjects() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Whether project push-down is supported by the IO API.
- supportsProjects() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
- supportsSerializationWithKeyNormalization() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- synchronize(RestrictionTracker<RestrictionT, PositionT>) - Static method in class org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers
- SYNCHRONIZED_PROCESSING_TIME - Enum constant in enum class org.apache.beam.sdk.state.TimeDomain
-
For internal use only; no backwards compatibility guarantees.
- synchronizedPlainRead(KafkaIOUtilsBenchmark.PlainAccumulatorState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- synchronizedPlainReadWhileWriting(KafkaIOUtilsBenchmark.PlainAccumulatorState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- synchronizedPlainWrite(KafkaIOUtilsBenchmark.PlainAccumulatorState, KafkaIOUtilsBenchmark.ProducerState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- synchronizedPlainWriteWhileReading(KafkaIOUtilsBenchmark.PlainAccumulatorState, KafkaIOUtilsBenchmark.ProducerState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- SynchronizedStreamObserver<V> - Class in org.apache.beam.sdk.fn.stream
-
A
StreamObserver
which provides synchronous access to an underlyingStreamObserver
. - SystemReduceFnBuffering<K,
T, - Class in org.apache.beam.runners.twister2.translators.functions.internalW> - SystemReduceFnBuffering() - Constructor for class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
- SystemReduceFnBuffering(Coder<T>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
T
- T__0 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- T__0 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- T__1 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- T__1 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- T__2 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- T__2 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- T__3 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- T__3 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- T__4 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- T__4 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- table() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
- table() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
- Table - Class in org.apache.beam.sdk.extensions.sql.meta
-
Represents the metadata of a
BeamSqlTable
. - Table() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.Table
- TABLE - Enum constant in enum class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.ScanType
- TABLE_FIELD_SCHEMAS - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
- TABLE_METADATA_VIEW_UNSPECIFIED - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
- TABLE_ROW_ERROR_CONTAINER - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
- Table.Builder - Class in org.apache.beam.sdk.extensions.sql.meta
-
Builder class for
Table
. - TableAlreadyExistsException - Exception Class in org.apache.beam.sdk.io.iceberg
- TableAlreadyExistsException(Throwable) - Constructor for exception class org.apache.beam.sdk.io.iceberg.TableAlreadyExistsException
- TableAndQuery() - Constructor for class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- TableAndRecord<T> - Class in org.apache.beam.sdk.io.kudu
-
A wrapper for a
KuduTable
and theTableAndRecord
representing a typed record. - TableAndRecord(KuduTable, T) - Constructor for class org.apache.beam.sdk.io.kudu.TableAndRecord
- TableDestination - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Encapsulates a BigQuery table destination.
- TableDestination(TableReference, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(TableReference, String, TimePartitioning) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(TableReference, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(TableReference, String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(String, String, TimePartitioning, Clustering) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(String, String, TimePartitioning) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(String, String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestinationCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A coder for
TableDestination
objects. - TableDestinationCoderV2 - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A
Coder
forTableDestination
that includes time partitioning information. - TableDestinationCoderV2() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
- TableDestinationCoderV3 - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A
Coder
forTableDestination
that includes time partitioning and clustering information. - TableDestinationCoderV3() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
- tableExists() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Checks whether the metadata table already exists in the database.
- tableFieldToProtoTableField(TableFieldSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- TableName - Class in org.apache.beam.sdk.extensions.sql.impl
-
Represents a parsed table name that is specified in a FROM clause (and other places).
- TableName() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.TableName
- TableNameExtractionUtils - Class in org.apache.beam.sdk.extensions.sql
-
Helper class to extract table identifiers from the query.
- TableNameExtractionUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.TableNameExtractionUtils
- TableProvider - Interface in org.apache.beam.sdk.extensions.sql.meta.provider
-
A
TableProvider
handles the metadata CRUD of a specified kind of tables. - tableReference() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
- Table References - Search tag in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
- Section
- TableResolution - Class in org.apache.beam.sdk.extensions.sql.zetasql
-
Utility methods to resolve a table, given a top-level Calcite schema and a table path.
- TableResolution() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.TableResolution
- tableRowFromBeamRow() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- tableRowFromMessage(Message, boolean, Predicate<String>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- tableRowFromMessage(Message, boolean, Predicate<String>, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- TableRowJsonCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
- tableRows(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- tableRowToBeamRow() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- TableRowToStorageApiProto - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Utility methods for converting JSON
TableRow
objects to dynamic protocol message, for use with the Storage write API. - TableRowToStorageApiProto() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- TableRowToStorageApiProto.SchemaDoesntMatchException - Exception Class in org.apache.beam.sdk.io.gcp.bigquery
- TableRowToStorageApiProto.SchemaTooNarrowException - Exception Class in org.apache.beam.sdk.io.gcp.bigquery
- TableRowToStorageApiProto.SingleValueConversionException - Exception Class in org.apache.beam.sdk.io.gcp.bigquery
- tables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- tableSchema() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
- TableSchema - Class in org.apache.beam.sdk.io.clickhouse
-
A descriptor for ClickHouse table schema.
- TableSchema() - Constructor for class org.apache.beam.sdk.io.clickhouse.TableSchema
- TableSchema.Column - Class in org.apache.beam.sdk.io.clickhouse
-
A column in ClickHouse table.
- TableSchema.ColumnType - Class in org.apache.beam.sdk.io.clickhouse
-
A descriptor for a column type.
- TableSchema.DefaultType - Enum Class in org.apache.beam.sdk.io.clickhouse
-
An enumeration of possible kinds of default values in ClickHouse.
- TableSchema.TypeName - Enum Class in org.apache.beam.sdk.io.clickhouse
-
An enumeration of possible types in ClickHouse.
- TableSchemaCache - Class in org.apache.beam.sdk.io.gcp.bigquery
-
An updatable cache for table schemas.
- TableSchemaUpdateUtils - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Helper utilities for handling schema-update responses.
- TableSchemaUpdateUtils() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaUpdateUtils
- tableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
- TableUtils - Class in org.apache.beam.sdk.extensions.sql
- TableWithRows(long, Table) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.TableWithRows
- TaggedKeyedPCollection(TupleTag<V>, PCollection<KV<K, V>>) - Constructor for class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
- TaggedPValue - Class in org.apache.beam.sdk.values
-
For internal use only; no backwards-compatibility guarantees.
- TaggedPValue() - Constructor for class org.apache.beam.sdk.values.TaggedPValue
- take() - Method in class org.apache.beam.sdk.fn.CancellableQueue
-
Takes an element from this queue.
- take(String, Duration) - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool.Source
-
Retrieves the
InstructionRequestHandler
for the given worker id, blocking until available or the request times out. - takeOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated.Use
TestPipeline
with theDirectRunner
. - takeOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated.Use
TestPipeline
with theDirectRunner
. - takeOutputElementsWithTimestamp() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Deprecated.Use
TestPipeline
with theDirectRunner
. - tanh(Double) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinTrigonometricFunctions
-
TANH(X)
- targetForRootUrl(String) - Static method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
-
Internal only utility for converting
PubsubOptions.getPubsubRootUrl()
(e.g. - TDigestQuantiles - Class in org.apache.beam.sdk.extensions.sketching
-
PTransform
s for getting information about quantiles in a stream. - TDigestQuantiles() - Constructor for class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
- TDigestQuantiles.GlobalDigest - Class in org.apache.beam.sdk.extensions.sketching
-
Implementation of
TDigestQuantiles.globally()
. - TDigestQuantiles.PerKeyDigest<K> - Class in org.apache.beam.sdk.extensions.sketching
-
Implementation of
TDigestQuantiles.perKey()
. - TDigestQuantiles.TDigestQuantilesFn - Class in org.apache.beam.sdk.extensions.sketching
-
Implements the
Combine.CombineFn
ofTDigestQuantiles
transforms. - teardown() - Method in interface org.apache.beam.io.requestresponse.SetupTeardown
-
Called during the
DoFn
's teardown lifecycle method. - teardown() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
- teardown() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
- teardown() - Method in interface org.apache.beam.sdk.io.kafka.CheckStopReadingFn
- teardown() - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
- teardown() - Method in class org.apache.beam.sdk.io.pulsar.WriteToPulsarDoFn
- teardown() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- tearDown() - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlSeekableTable
-
Cleans up resources of the instance.
- tearDown() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.ProtobufByteStringOutputStream
- tearDown() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.SdkCoreByteStringOutputStream
- tearDown(Blackhole) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark.ByteStringOutput
- Tee<T> - Class in org.apache.beam.sdk.transforms
-
A PTransform that returns its input, but also applies its input to an auxiliary PTransform, akin to the shell
tee
command, which is named after the T-splitter used in plumbing. - TEMP_FILES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
The tag for temp files for import to FHIR store.
- Temporary and Output File Naming: - Search tag in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
- Section
- test(RunnerApi.PTransform) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.IsFlinkNativeTransform
- test(RunnerApi.PTransform) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.IsFlinkNativeTransform
- test(RunnerApi.PTransform) - Method in class org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.IsSparkNativeTransform
- TEST_REUSE_SPARK_CONTEXT - Static variable in class org.apache.beam.runners.spark.translation.SparkContextFactory
-
Deprecated.This will leak your SparkContext, any attempt to create a new SparkContext later will fail. Please use
SparkContextFactory.setProvidedSparkContext(JavaSparkContext)
/SparkContextFactory.clearProvidedSparkContext()
instead to properly control the lifecycle of your context. Alternatively you may also provide a SparkContext usingSparkPipelineOptions.setUsesProvidedSparkContext(boolean)
together withSparkContextOptions.setProvidedSparkContext(JavaSparkContext)
and close that one appropriately. Tests of this module should useSparkContextRule
. - TestBigQuery - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Test rule which creates a new table with the specified schema and a randomized name, and exposes a few APIs to work with it.
- TestBigQuery.PollingAssertion - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
Interface to implement a polling assertion.
- TestBigQuery.RowsAssertion - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Interface for creating a polling eventual assertion.
- TestBigQueryOptions - Interface in org.apache.beam.sdk.io.gcp.bigquery
- TestBoundedTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
-
Mocked table for bounded data sources.
- TestBoundedTable(Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
- testByteCount(Coder<T>, Coder.Context, T[]) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
A utility method that passes the given (unencoded) elements through coder's registerByteSizeObserver() and encode() methods, and confirms they are mutually consistent.
- testCombineFn(Combine.CombineFn<InputT, AccumT, OutputT>, List<InputT>, Matcher<? super OutputT>) - Static method in class org.apache.beam.sdk.testing.CombineFnTester
- testCombineFn(Combine.CombineFn<InputT, AccumT, OutputT>, List<InputT>, OutputT) - Static method in class org.apache.beam.sdk.testing.CombineFnTester
-
Tests that the
Combine.CombineFn
, when applied to the provided input, produces the provided output. - testCopyArray(ByteStringOutputStreamBenchmark.NewVsCopy.ArrayCopyState, Blackhole) - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy
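A sketch of the testCombineFn usage described above, pairing it with the Sum combine fn; the input list and expected value are illustrative:

    import java.util.Arrays;
    import org.apache.beam.sdk.testing.CombineFnTester;
    import org.apache.beam.sdk.transforms.Sum;

    public class SumCombineFnTest {
      public void sumCombinesToSix() {
        // Exercises the CombineFn over many shardings and merge orders of
        // the input, asserting that the result is always 6.
        CombineFnTester.testCombineFn(Sum.ofIntegers(), Arrays.asList(1, 2, 3), 6);
      }
    }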
- TestDataflowPipelineOptions - Interface in org.apache.beam.runners.dataflow
-
A set of options used to configure the
TestPipeline
. - TestDataflowRunner - Class in org.apache.beam.runners.dataflow
-
TestDataflowRunner
is a pipeline runner that wraps aDataflowRunner
when running tests against theTestPipeline
. - TestDStream<T> - Class in org.apache.beam.runners.spark.translation.streaming
- TestDStream(TestStream<T>, StreamingContext) - Constructor for class org.apache.beam.runners.spark.translation.streaming.TestDStream
- TestElementByteSizeObserver() - Constructor for class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
- TestExecutors - Class in org.apache.beam.sdk.fn.test
-
A
TestRule
that validates that all submitted tasks finished and were completed. - TestExecutors() - Constructor for class org.apache.beam.sdk.fn.test.TestExecutors
- TestExecutors.TestExecutorService - Interface in org.apache.beam.sdk.fn.test
-
A union of the
ExecutorService
andTestRule
interfaces. - TestFlinkRunner - Class in org.apache.beam.runners.flink
-
Test Flink runner.
- TESTING - Enum constant in enum class org.apache.beam.sdk.io.solace.SolaceIO.SubmissionMode
- testingPipelineOptions() - Static method in class org.apache.beam.sdk.testing.TestPipeline
-
Creates
PipelineOptions
for testing. - TestJobService - Class in org.apache.beam.runners.portability.testing
-
A JobService for tests.
- TestJobService(Endpoints.ApiServiceDescriptor, String, String, JobApi.JobState.Enum, JobApi.MetricResults) - Constructor for class org.apache.beam.runners.portability.testing.TestJobService
- testNewArray(ByteStringOutputStreamBenchmark.NewVsCopy.ArrayNewState, Blackhole) - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy
- TestPipeline - Class in org.apache.beam.sdk.testing
-
A creator of test pipelines, intended for use inside tests, that can be configured to run locally or against a remote pipeline runner.
- TestPipeline.AbandonedNodeException - Exception Class in org.apache.beam.sdk.testing
-
An exception thrown in case an abandoned
PTransform
is detected, that is, aPTransform
that has not been run. - TestPipeline.PipelineRunMissingException - Exception Class in org.apache.beam.sdk.testing
-
An exception thrown in case a test finishes without invoking
Pipeline.run()
. - TestPipeline.TestValueProviderOptions - Interface in org.apache.beam.sdk.testing
-
Implementation detail of
TestPipeline.newProvider(T)
, do not use. - TestPipelineOptions - Interface in org.apache.beam.sdk.testing
-
TestPipelineOptions
is a set of options for test pipelines. - TestPipelineOptions.AlwaysPassMatcher - Class in org.apache.beam.sdk.testing
-
Matcher which will always pass.
- TestPipelineOptions.AlwaysPassMatcherFactory - Class in org.apache.beam.sdk.testing
-
Factory for
PipelineResult
matchers which always pass. - TestPortablePipelineOptions - Interface in org.apache.beam.runners.portability.testing
-
Options for
TestPortableRunner
. - TestPortablePipelineOptions.DefaultJobServerConfigFactory - Class in org.apache.beam.runners.portability.testing
-
Factory for default config.
- TestPortablePipelineOptions.TestPortablePipelineOptionsRegistrar - Class in org.apache.beam.runners.portability.testing
-
Register
TestPortablePipelineOptions
. - TestPortablePipelineOptionsRegistrar() - Constructor for class org.apache.beam.runners.portability.testing.TestPortablePipelineOptions.TestPortablePipelineOptionsRegistrar
- TestPortableRunner - Class in org.apache.beam.runners.portability.testing
-
TestPortableRunner
is a pipeline runner that wraps aPortableRunner
when running tests against theTestPipeline
. - TestPrismPipelineOptions - Interface in org.apache.beam.runners.prism
-
PipelineOptions
for use with theTestPrismRunner
. - TestPrismRunner - Class in org.apache.beam.runners.prism
- testProtobufByteStringOutputStreamFewLargeWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testProtobufByteStringOutputStreamFewMixedWritesWithoutReuse() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testProtobufByteStringOutputStreamFewMixedWritesWithReuse(ByteStringOutputStreamBenchmark.ProtobufByteStringOutputStream) - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testProtobufByteStringOutputStreamFewSmallWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testProtobufByteStringOutputStreamFewTinyWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testProtobufByteStringOutputStreamManyLargeWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testProtobufByteStringOutputStreamManyMixedWritesWithoutReuse() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testProtobufByteStringOutputStreamManyMixedWritesWithReuse(ByteStringOutputStreamBenchmark.ProtobufByteStringOutputStream) - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testProtobufByteStringOutputStreamManySmallWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testProtobufByteStringOutputStreamManyTinyWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- TestPubsub - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Test rule which creates a new topic and subscription with randomized names and exposes the APIs to work with them.
- TestPubsub.PollingAssertion - Interface in org.apache.beam.sdk.io.gcp.pubsub
- TestPubsubOptions - Interface in org.apache.beam.sdk.io.gcp.pubsub
- TestPubsubSignal - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Test rule which observes elements of the
PCollection
and checks whether they match the success criteria. - TestSchemaTransformProvider - Class in org.apache.beam.sdk.managed.testing
- TestSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider
- TestSchemaTransformProvider.Config - Class in org.apache.beam.sdk.managed.testing
- TestSchemaTransformProvider.Config.Builder - Class in org.apache.beam.sdk.managed.testing
- testSdkCoreByteStringOutputStreamFewLargeWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testSdkCoreByteStringOutputStreamFewMixedWritesWithoutReuse() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testSdkCoreByteStringOutputStreamFewMixedWritesWithReuse(ByteStringOutputStreamBenchmark.SdkCoreByteStringOutputStream) - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testSdkCoreByteStringOutputStreamFewSmallWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testSdkCoreByteStringOutputStreamFewTinyWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testSdkCoreByteStringOutputStreamManyLargeWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testSdkCoreByteStringOutputStreamManyMixedWritesWithoutReuse() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testSdkCoreByteStringOutputStreamManyMixedWritesWithReuse(ByteStringOutputStreamBenchmark.SdkCoreByteStringOutputStream) - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testSdkCoreByteStringOutputStreamManySmallWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- testSdkCoreByteStringOutputStreamManyTinyWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
- TestSparkPipelineOptions - Interface in org.apache.beam.runners.spark
-
A
SparkPipelineOptions
for tests. - TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory - Class in org.apache.beam.runners.spark
-
A factory to provide the default watermark to stop a pipeline that reads from an unbounded source.
- TestSparkRunner - Class in org.apache.beam.runners.spark
-
The SparkRunner translates operations defined on a pipeline into a representation executable by Spark, and then submits the job to Spark for execution.
- TestStream<T> - Class in org.apache.beam.sdk.testing
-
A testing input that generates an unbounded
PCollection
of elements, advancing the watermark and processing time as elements are emitted. - TestStream.Builder<T> - Class in org.apache.beam.sdk.testing
-
An incomplete
TestStream
. - TestStream.ElementEvent<T> - Class in org.apache.beam.sdk.testing
-
A
TestStream.Event
that produces elements. - TestStream.Event<T> - Interface in org.apache.beam.sdk.testing
-
An event in a
TestStream
. - TestStream.EventType - Enum Class in org.apache.beam.sdk.testing
-
The types of
TestStream.Event
that are supported byTestStream
. - TestStream.ProcessingTimeEvent<T> - Class in org.apache.beam.sdk.testing
-
A
TestStream.Event
that advances the processing time clock. - TestStream.TestStreamCoder<T> - Class in org.apache.beam.sdk.testing
-
Coder for
TestStream
. - TestStream.WatermarkEvent<T> - Class in org.apache.beam.sdk.testing
-
A
TestStream.Event
that advances the watermark. - TestStreams - Class in org.apache.beam.sdk.fn.test
-
Utility methods which enable testing of
StreamObserver
s. - TestStreams() - Constructor for class org.apache.beam.sdk.fn.test.TestStreams
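The TestStream entries above are typically exercised together with a TestPipeline; a minimal sketch where the element values and timestamps are illustrative:

    import org.apache.beam.sdk.coders.VarIntCoder;
    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.testing.TestStream;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Instant;

    public class TestStreamExample {
      public void elementsArriveBeforeWatermarkAdvances() {
        TestPipeline p = TestPipeline.create();
        TestStream<Integer> events =
            TestStream.create(VarIntCoder.of())
                .addElements(1, 2)                     // an ElementEvent
                .advanceWatermarkTo(new Instant(100L)) // a WatermarkEvent
                .addElements(3)
                .advanceWatermarkToInfinity();         // terminates the stream
        PCollection<Integer> input = p.apply(events);
        PAssert.that(input).containsInAnyOrder(1, 2, 3);
        p.run().waitUntilFinish();
      }
    }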
- TestStreams.Builder<T> - Class in org.apache.beam.sdk.fn.test
-
A builder for a test
CallStreamObserver
that performs various callbacks. - TestStreamSource<T> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io
-
Flink source for executing
TestStream
. - TestStreamSource(SerializableFunction<byte[], TestStream<T>>, byte[]) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.TestStreamSource
- TestTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
-
Base class for mocked table.
- TestTable(Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTable
- TestTableFilter - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
- TestTableFilter(List<RexNode>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableFilter
- TestTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
-
Test in-memory table provider for use in tests.
- TestTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
- TestTableProvider.PushDownOptions - Enum Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
- TestTableProvider.TableWithRows - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
-
TableWithRows.
- TestTableUtils - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
-
Utility functions for mock classes.
- TestTableUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
- TestUnboundedTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
-
A mocked unbounded table.
- TestUniversalRunner - Class in org.apache.beam.runners.portability.testing
- TestUniversalRunner.Options - Interface in org.apache.beam.runners.portability.testing
- TestUniversalRunner.OptionsRegistrar - Class in org.apache.beam.runners.portability.testing
-
Register
TestUniversalRunner.Options
. - TestUniversalRunner.RunnerRegistrar - Class in org.apache.beam.runners.portability.testing
-
Registrar for the portable runner.
- TextIO - Class in org.apache.beam.sdk.io
-
PTransform
s for reading and writing text files. - TextIO.CompressionType - Enum Class in org.apache.beam.sdk.io
-
Deprecated.Use
Compression
. - TextIO.Read - Class in org.apache.beam.sdk.io
-
Implementation of
TextIO.read()
. - TextIO.ReadAll - Class in org.apache.beam.sdk.io
-
Deprecated.See
TextIO.readAll()
for details. - TextIO.ReadFiles - Class in org.apache.beam.sdk.io
-
Implementation of
TextIO.readFiles()
. - TextIO.Sink - Class in org.apache.beam.sdk.io
-
Implementation of
TextIO.sink()
. - TextIO.TypedWrite<UserT,
DestinationT> - Class in org.apache.beam.sdk.io -
Implementation of
TextIO.write()
. - TextIO.Write - Class in org.apache.beam.sdk.io
-
This class is used as the default return value of
TextIO.write()
. - TextJsonTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
-
TextJsonTable
is aBeamSqlTable
that reads text files and converts them according to the JSON format. - TextJsonTable(Schema, String, TextTableProvider.JsonToRow, TextTableProvider.RowToJson) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextJsonTable
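A minimal sketch of the TextIO read/write entries above; the file paths are placeholders:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.values.PCollection;

    public class TextIOExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        // TextIO.read() produces one String per line of the matched files.
        PCollection<String> lines = p.apply(TextIO.read().from("/path/to/input*.txt"));
        // TextIO.write() writes one line per element.
        lines.apply(TextIO.write().to("/path/to/output").withSuffix(".txt"));
        p.run().waitUntilFinish();
      }
    }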
- TextMessageMapper - Class in org.apache.beam.sdk.io.jms
- TextMessageMapper() - Constructor for class org.apache.beam.sdk.io.jms.TextMessageMapper
- TextRowCountEstimator - Class in org.apache.beam.sdk.io
-
Returns a row count estimation for the files associated with a file pattern.
- TextRowCountEstimator() - Constructor for class org.apache.beam.sdk.io.TextRowCountEstimator
- TextRowCountEstimator.Builder - Class in org.apache.beam.sdk.io
-
Builder for
TextRowCountEstimator
. - TextRowCountEstimator.LimitNumberOfFiles - Class in org.apache.beam.sdk.io
-
This strategy stops sampling once a given number of files has been sampled.
- TextRowCountEstimator.LimitNumberOfTotalBytes - Class in org.apache.beam.sdk.io
-
This strategy stops sampling when the total number of sampled bytes exceeds a given threshold.
- TextRowCountEstimator.NoEstimationException - Exception Class in org.apache.beam.sdk.io
-
An exception that will be thrown if the estimator cannot get an estimation of the number of lines.
- TextRowCountEstimator.SampleAllFiles - Class in org.apache.beam.sdk.io
-
This strategy samples all the files.
- TextRowCountEstimator.SamplingStrategy - Interface in org.apache.beam.sdk.io
-
A sampling strategy determines when to stop reading further files.
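A sketch of how the estimator and a sampling strategy above fit together, assuming an AutoValue-style builder with a setFilePattern setter (the builder method name and file pattern are assumptions for illustration):

    import org.apache.beam.sdk.io.TextRowCountEstimator;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class RowCountExample {
      public static void main(String[] args) throws Exception {
        PipelineOptions options = PipelineOptionsFactory.create();
        TextRowCountEstimator estimator =
            TextRowCountEstimator.builder()
                .setFilePattern("/path/to/input*.txt") // assumed builder setter
                .build();
        // Throws TextRowCountEstimator.NoEstimationException when no
        // estimate can be produced for the matched files.
        double rows = estimator.estimateRowCount(options);
        System.out.println("Estimated rows: " + rows);
      }
    }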
- TextSource - Class in org.apache.beam.sdk.io
-
Implementation detail of
TextIO.Read
. - TextSource(MatchResult.Metadata, long, long, byte[]) - Constructor for class org.apache.beam.sdk.io.TextSource
- TextSource(MatchResult.Metadata, long, long, byte[], int) - Constructor for class org.apache.beam.sdk.io.TextSource
- TextSource(ValueProvider<String>, EmptyMatchTreatment, byte[]) - Constructor for class org.apache.beam.sdk.io.TextSource
- TextSource(ValueProvider<String>, EmptyMatchTreatment, byte[], int) - Constructor for class org.apache.beam.sdk.io.TextSource
- TextSourceBenchmark - Class in org.apache.beam.sdk.jmh.io
- TextSourceBenchmark() - Constructor for class org.apache.beam.sdk.jmh.io.TextSourceBenchmark
- TextSourceBenchmark.Data - Class in org.apache.beam.sdk.jmh.io
- TextTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
-
TextTable
is aBeamSqlTable
that reads text files and converts them according to the specified format. - TextTable(Schema, String, PTransform<PCollection<String>, PCollection<Row>>, PTransform<PCollection<Row>, PCollection<String>>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
-
Text table with the specified read and write transforms.
- TextTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
-
Text table provider.
- TextTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider
- TextTableProvider.CsvToRow - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
-
Read-side converter for
TextTable
with format'csv'
. - TextTableProvider.LinesReadConverter - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
-
Read-side converter for
TextTable
with format'lines'
. - TextTableProvider.LinesWriteConverter - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
-
Write-side converter for
TextTable
with format'lines'
. - TextualIntegerCoder - Class in org.apache.beam.sdk.coders
-
A
Coder
that encodesIntegers
as the ASCII bytes of their textual, decimal representation. - TextualIntegerCoder() - Constructor for class org.apache.beam.sdk.coders.TextualIntegerCoder
- TFRecordIO - Class in org.apache.beam.sdk.io
-
PTransform
s for reading and writing TensorFlow TFRecord files. - TFRecordIO.CompressionType - Enum Class in org.apache.beam.sdk.io
-
Deprecated.Use
Compression
. - TFRecordIO.Read - Class in org.apache.beam.sdk.io
-
Implementation of
TFRecordIO.read()
. - TFRecordIO.ReadFiles - Class in org.apache.beam.sdk.io
-
Implementation of
TFRecordIO.readFiles()
. - TFRecordIO.Sink - Class in org.apache.beam.sdk.io
- TFRecordIO.Write - Class in org.apache.beam.sdk.io
-
Implementation of
TFRecordIO.write()
. - TFRecordReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io
-
Configuration for reading from TFRecord.
- TFRecordReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration
- TFRecordReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io
-
Builder for
TFRecordReadSchemaTransformConfiguration
. - TFRecordReadSchemaTransformProvider - Class in org.apache.beam.sdk.io
- TFRecordReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.TFRecordReadSchemaTransformProvider
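For the plain TFRecordIO entries above, a minimal read/write sketch; the paths are placeholders:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TFRecordIO;
    import org.apache.beam.sdk.values.PCollection;

    public class TFRecordIOExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        // Each element is the raw byte payload of one TFRecord.
        PCollection<byte[]> records =
            p.apply(TFRecordIO.read().from("/path/to/data*.tfrecord"));
        records.apply(TFRecordIO.write().to("/path/to/output"));
        p.run().waitUntilFinish();
      }
    }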
- TFRecordReadSchemaTransformProvider.ErrorFn - Class in org.apache.beam.sdk.io
- TFRecordReadSchemaTransformTranslator() - Constructor for class org.apache.beam.sdk.io.TFRecordSchemaTransformTranslation.TFRecordReadSchemaTransformTranslator
- TFRecordSchemaTransformTranslation - Class in org.apache.beam.sdk.io
- TFRecordSchemaTransformTranslation() - Constructor for class org.apache.beam.sdk.io.TFRecordSchemaTransformTranslation
- TFRecordSchemaTransformTranslation.ReadWriteRegistrar - Class in org.apache.beam.sdk.io
- TFRecordSchemaTransformTranslation.TFRecordReadSchemaTransformTranslator - Class in org.apache.beam.sdk.io
- TFRecordSchemaTransformTranslation.TFRecordWriteSchemaTransformTranslator - Class in org.apache.beam.sdk.io
- TFRecordWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io
-
Configuration for writing to TFRecord.
- TFRecordWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- TFRecordWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io
- TFRecordWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io
- TFRecordWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformProvider
- TFRecordWriteSchemaTransformProvider.ErrorFn - Class in org.apache.beam.sdk.io
- TFRecordWriteSchemaTransformTranslator() - Constructor for class org.apache.beam.sdk.io.TFRecordSchemaTransformTranslation.TFRecordWriteSchemaTransformTranslator
- that(String, PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
Constructs an
PAssert.IterableAssert
for the elements of the providedPCollection
with the specified reason. - that(PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
Constructs an
PAssert.IterableAssert
for the elements of the providedPCollection
. - thatFlattened(String, PCollectionList<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
Constructs an
PAssert.IterableAssert
for the elements of the flattenedPCollectionList
with the specified reason. - thatFlattened(PCollectionList<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
Constructs an
PAssert.IterableAssert
for the elements of the flattenedPCollectionList
. - thatList(PCollectionList<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
Constructs an
PAssert.PCollectionListContentsAssert
for the providedPCollectionList
. - thatMap(String, PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
Constructs a
PAssert.SingletonAssert
for the value of the providedPCollection
with the specified reason. - thatMap(PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
Constructs a
PAssert.SingletonAssert
for the value of the providedPCollection
, which must have at most one value per key. - thatMultimap(String, PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
Constructs a
PAssert.SingletonAssert
for the value of the providedPCollection
with the specified reason. - thatMultimap(PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
Constructs a
PAssert.SingletonAssert
for the value of the providedPCollection
. - thatSingleton(String, PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
Constructs a
PAssert.SingletonAssert
for the value of the providedPCollection<T>
with the specified reason. - thatSingleton(PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
Constructs a
PAssert.SingletonAssert
for the value of the providedPCollection<T>
, which must be a singleton. - thatSingletonIterable(String, PCollection<? extends Iterable<T>>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
Constructs an
PAssert.IterableAssert
for the value of the providedPCollection
with the specified reason. - thatSingletonIterable(PCollection<? extends Iterable<T>>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
Constructs an
PAssert.IterableAssert
for the value of the providedPCollection
which must contain a singleIterable<T>
value. - The advanced SolaceIO#read(TypeDescriptor, SerializableFunction, SerializableFunction) method - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- The no-arg SolaceIO#read() method - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
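A compact sketch combining several of the PAssert entry points listed above; the element values are illustrative:

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;

    public class PAssertExample {
      public void countsWords() {
        TestPipeline p = TestPipeline.create();
        PCollection<String> words = p.apply(Create.of("a", "b", "a"));
        // IterableAssert over all elements of the PCollection.
        PAssert.that(words).containsInAnyOrder("a", "a", "b");
        // SingletonAssert over a PCollection holding exactly one value.
        PAssert.thatSingleton(words.apply(Count.globally())).isEqualTo(3L);
        p.run().waitUntilFinish();
      }
    }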
- threadSafe(WatermarkEstimator<WatermarkEstimatorStateT>) - Static method in class org.apache.beam.sdk.fn.splittabledofn.WatermarkEstimators
-
Returns a thread safe
WatermarkEstimator
which allows getting a snapshot of the current watermark and watermark estimator state. - Thread safety - Search tag in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
- Section
- Thread safety - Search tag in org.apache.beam.sdk.io.BoundedSource.BoundedReader.getFractionConsumed()
- Section
- Thread safety - Search tag in org.apache.beam.sdk.io.BoundedSource.BoundedReader.getSplitPointsConsumed()
- Section
- Thread safety - Search tag in org.apache.beam.sdk.io.BoundedSource.BoundedReader.getSplitPointsRemaining()
- Section
- Thread Safety - Search tag in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
- Section
- Thread safety and blocking - Search tag in org.apache.beam.sdk.io.BoundedSource.BoundedReader.splitAtFraction(double)
- Section
- ThriftCoder<T> - Class in org.apache.beam.sdk.io.thrift
-
A
Coder
using a ThriftTProtocol
to serialize/deserialize elements. - ThriftCoder(Class<T>, TProtocolFactory) - Constructor for class org.apache.beam.sdk.io.thrift.ThriftCoder
- ThriftIO - Class in org.apache.beam.sdk.io.thrift
-
PTransform
s for reading and writing files containing Thrift encoded data. - ThriftIO.ReadFiles<T> - Class in org.apache.beam.sdk.io.thrift
-
Implementation of
ThriftIO.readFiles(java.lang.Class<T>)
. - ThriftIO.Sink<T> - Class in org.apache.beam.sdk.io.thrift
-
Implementation of
ThriftIO.sink(org.apache.thrift.protocol.TProtocolFactory)
. - ThriftIO.ThriftWriter<T> - Class in org.apache.beam.sdk.io.thrift
-
Writer to write Thrift object to
OutputStream
. - ThriftPayloadSerializerProvider - Class in org.apache.beam.sdk.io.thrift
- ThriftPayloadSerializerProvider() - Constructor for class org.apache.beam.sdk.io.thrift.ThriftPayloadSerializerProvider
- ThriftSchema - Class in org.apache.beam.sdk.io.thrift
-
Schema provider for generated thrift types.
- ThriftSchema.Customizer - Class in org.apache.beam.sdk.io.thrift
- THROTTLE_TIME_COUNTER_NAME - Static variable in class org.apache.beam.sdk.metrics.Metrics
- THROTTLE_TIME_NAMESPACE - Static variable in class org.apache.beam.sdk.metrics.Metrics
- THROTTLED_TIME - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
- throttledBaseBackoff() - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
- throttledBaseBackoff(Duration) - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
- throttledTimeCounter(BigQuerySinkMetrics.RpcMethod) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
- throttleRampup - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
- throttleRampup - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
- throttleRampup - Variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
- Throttling - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- THROUGHPUT_BASED - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
-
Autoscale the workerpool based on throughput (up to maxNumWorkers).
- THROUGHPUT_WINDOW_SECONDS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The sliding window size in seconds for throughput reporting.
- Throughput and latency - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- ThroughputEstimator<T> - Interface in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
-
An estimator to calculate the throughput of the elements output by a DoFn.
- throwable() - Method in class org.apache.beam.sdk.values.EncodableThrowable
-
Returns the underlying
Throwable
. - ThrowableHandler() - Constructor for class org.apache.beam.sdk.transforms.WithFailures.ThrowableHandler
- throwableToGRPCCodeString(Throwable) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
-
Converts a Throwable to a gRPC Status code.
- THROWING_ROUTER - Static variable in interface org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter
- ThrowingBadRecordRouter() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter.ThrowingBadRecordRouter
- ThrowingBiConsumer<T1,
T2> - Interface in org.apache.beam.sdk.function -
A
BiConsumer
which can throwException
s. - ThrowingBiFunction<T1,
T2, - Interface in org.apache.beam.sdk.functionT3> -
A
BiFunction
which can throwException
s. - ThrowingConsumer<ExceptionT,
T> - Interface in org.apache.beam.sdk.function - ThrowingFunction<T1,
T2> - Interface in org.apache.beam.sdk.function - ThrowingRunnable - Interface in org.apache.beam.sdk.function
- throwNullCredentialException() - Static method in class org.apache.beam.sdk.extensions.gcp.auth.NullCredentialInitializer
- TikaIO - Class in org.apache.beam.sdk.io.tika
-
Transforms for parsing arbitrary files using Apache Tika.
- TikaIO() - Constructor for class org.apache.beam.sdk.io.tika.TikaIO
- TikaIO.Parse - Class in org.apache.beam.sdk.io.tika
-
Implementation of
TikaIO.parse()
. - TikaIO.ParseFiles - Class in org.apache.beam.sdk.io.tika
-
Implementation of
TikaIO.parseFiles()
. - time() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
- Time - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A time without a time-zone.
- Time() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.Time
- TIME - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- TIME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SqlTypes
-
Beam LogicalType corresponding to ZetaSQL/CalciteSQL TIME type.
- TIME_FIELD_NAME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- TIME_WITH_LOCAL_TZ - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- TimeConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeConversion
- timeDomain() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
-
Returns the time domain of the current timer.
- TimeDomain - Enum Class in org.apache.beam.sdk.state
-
TimeDomain
specifies whether an operation is based on timestamps of elements or current "real-world" time as reported while processing. - TimeMicrosConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
- TimeMicrosConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeMicrosConversion
- TimeMillisConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
- timer(String, String, String) - Static method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
- timer(TimeDomain) - Static method in class org.apache.beam.sdk.state.TimerSpecs
- Timer - Interface in org.apache.beam.sdk.state
-
A timer for a specified time domain that can be set to register the desire for further processing at particular time in its specified time domain.
- TIMER_MARKER - Static variable in class org.apache.beam.runners.spark.util.TimerUtils
-
Constant marker used to identify timer values in transformations.
- TimerEndpoint<T> - Class in org.apache.beam.sdk.fn.data
- TimerEndpoint() - Constructor for class org.apache.beam.sdk.fn.data.TimerEndpoint
- timerId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
- timerInternals - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- timerInternals() - Method in class org.apache.beam.runners.flink.translation.functions.FlinkNoOpStepContext
- timerInternals() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.FlinkStepContext
- timerInternals() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.NoOpStepContext
- timerInternals() - Method in class org.apache.beam.runners.twister2.utils.NoOpStepContext
- timerMap(TimeDomain) - Static method in class org.apache.beam.sdk.state.TimerSpecs
- TimerMap - Interface in org.apache.beam.sdk.state
- TimerMarker() - Constructor for class org.apache.beam.runners.spark.util.TimerUtils.TimerMarker
- TimerReceiverFactory - Class in org.apache.beam.runners.fnexecution.control
-
A factory that passes timers to
TimerReceiverFactory.timerDataConsumer
. - TimerReceiverFactory(StageBundleFactory, BiConsumer<Timer<?>, TimerInternals.TimerData>, Coder) - Constructor for class org.apache.beam.runners.fnexecution.control.TimerReceiverFactory
- Timers - Interface in org.apache.beam.sdk.state
-
Interface for interacting with time.
- timerService - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- timersIterable() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItem
- TimerSpec - Interface in org.apache.beam.sdk.state
-
A specification for a
Timer
. - TimerSpec() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
- TimerSpecs - Class in org.apache.beam.sdk.state
-
Static methods for working with
TimerSpecs
. - TimerSpecs() - Constructor for class org.apache.beam.sdk.state.TimerSpecs
- TimerUtils - Class in org.apache.beam.runners.spark.util
-
Utility class for handling timers in the Spark runner.
- TimerUtils() - Constructor for class org.apache.beam.runners.spark.util.TimerUtils
- TimerUtils.TimerMarker - Class in org.apache.beam.runners.spark.util
-
A marker class used to identify timer keys and values in Spark transformations.
- TIMES - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
- timestamp() - Method in class org.apache.beam.runners.twister2.utils.Twister2AssignContext
- timestamp() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
-
Returns the output timestamp of the current timer.
- timestamp() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
-
Returns the timestamp of the input element.
- timestamp() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
-
Returns the timestamp of the current element.
- timestamp(Integer) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.TimestampFunctions
- timestamp(Integer, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.TimestampFunctions
- timestamp(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.TimestampFunctions
- timestamp(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.TimestampFunctions
- TIMESTAMP - Enum constant in enum class org.apache.beam.sdk.transforms.display.DisplayData.Type
- TIMESTAMP - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- TIMESTAMP - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SqlTypes
-
Beam LogicalType corresponding to ZetaSQL TIMESTAMP type.
- TIMESTAMP_FIELD_NAME - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
- TIMESTAMP_MAX_VALUE - Static variable in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
-
The maximum value for any Beam timestamp.
- TIMESTAMP_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
- TIMESTAMP_MICROS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- TIMESTAMP_MIN_VALUE - Static variable in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
-
The minimum value for any Beam timestamp.
- TIMESTAMP_OP - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- TIMESTAMP_WITH_LOCAL_TZ - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- timestampAttributeKey - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- Section
- timestampColumnIndex(int) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
- TimestampCombiner - Enum Class in org.apache.beam.sdk.transforms.windowing
-
Policies for combining timestamps that occur within a window.
- TimeStampComparator() - Constructor for class org.apache.beam.runners.dataflow.util.MonitoringUtil.TimeStampComparator
- TimestampConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampConversion
- TimestampConvert() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.TimestampConvert
- TimestampConverter - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
-
Convert between different Timestamp and Instant classes.
- TimestampConverter() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
- timestamped(Iterable<TimestampedValue<T>>) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new
Create.TimestampedValues
transform that produces aPCollection
containing the elements of the providedIterable
with the specified timestamps. - timestamped(Iterable<T>, Iterable<Long>) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new root transform that produces a
PCollection
containing the specified elements with the specified timestamps. - timestamped(TimestampedValue<T>, TimestampedValue<T>...) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new
Create.TimestampedValues
transform that produces aPCollection
containing the specified elements with the specified timestamps. - TimestampedValue<V> - Class in org.apache.beam.sdk.values
-
An immutable pair of a value and a timestamp.
- TimestampedValue(V, Instant) - Constructor for class org.apache.beam.sdk.values.TimestampedValue
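As a quick illustration of the Create.timestamped overloads above (a minimal sketch; the element values and timestamps are made up):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TimestampedValue;
    import org.joda.time.Instant;

    Pipeline pipeline = Pipeline.create();
    PCollection<String> events =
        pipeline.apply(
            Create.timestamped(
                TimestampedValue.of("open", new Instant(0L)),
                TimestampedValue.of("close", new Instant(1000L))));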
- TimestampedValue.TimestampedValueCoder<T> - Class in org.apache.beam.sdk.values
-
A
Coder
forTimestampedValue
. - timestampedValueInGlobalWindow(T, Instant) - Static method in class org.apache.beam.sdk.values.WindowedValues
-
Returns a
WindowedValue
with the given value and timestamp,GlobalWindow
and default pane. - timestampedValueInGlobalWindow(T, Instant, PaneInfo) - Static method in class org.apache.beam.sdk.values.WindowedValues
-
Returns a
WindowedValue
with the given value, timestamp, and pane in theGlobalWindow
. - TimestampEncoding - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder
-
This encoder/decoder writes a com.google.cloud.Timestamp object as a pair of long and int to Avro, and reads a Timestamp object back from the same pair.
- TimestampEncoding() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder.TimestampEncoding
- timestampExtractor - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.bounded.FlinkBoundedSource
- timestampExtractor - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase
- TimestampFunctions - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
-
Implementations of ZetaSQL timestamp functions.
- TimestampFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.TimestampFunctions
- TimestampMicrosConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
- TimestampMicrosConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampMicrosConversion
- TimestampMillisConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
- timestampMsSinceEpoch() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
-
Timestamp for element (ms since epoch).
- TimestampObservingWatermarkEstimator<WatermarkEstimatorStateT> - Interface in org.apache.beam.sdk.transforms.splittabledofn
-
A
WatermarkEstimator
that observes the timestamps of all records output from aDoFn
. - TimestampPolicy<K,
V> - Class in org.apache.beam.sdk.io.kafka -
A timestamp policy that assigns event times to messages in a Kafka partition, and a watermark for the partition.
- TimestampPolicy() - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicy
- TimestampPolicy.PartitionContext - Class in org.apache.beam.sdk.io.kafka
-
The context contains state maintained in the reader for the partition.
- TimestampPolicyFactory<KeyT,
ValueT> - Interface in org.apache.beam.sdk.io.kafka -
An extendable factory to create a
TimestampPolicy
for each partition at runtime by the KafkaIO reader. - TimestampPolicyFactory.LogAppendTimePolicy<K,
V> - Class in org.apache.beam.sdk.io.kafka -
Assigns Kafka's log append time (server side ingestion time) to each record.
- TimestampPolicyFactory.ProcessingTimePolicy<K,
V> - Class in org.apache.beam.sdk.io.kafka -
A simple policy that uses the current time for event time and the watermark.
- TimestampPolicyFactory.TimestampFnPolicy<K,
V> - Class in org.apache.beam.sdk.io.kafka -
Internal policy to support the deprecated withTimestampFn API.
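A minimal sketch of selecting one of these policies when building a KafkaIO read (the broker address and topic are hypothetical):

    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.io.kafka.TimestampPolicyFactory;
    import org.apache.kafka.common.serialization.StringDeserializer;

    KafkaIO.Read<String, String> read =
        KafkaIO.<String, String>read()
            .withBootstrapServers("broker:9092")
            .withTopic("events")
            .withKeyDeserializer(StringDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            // Use the broker's log-append time as each record's event time.
            .withTimestampPolicyFactory(TimestampPolicyFactory.withLogAppendTime());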
- TimestampPrefixingWindowCoder<T> - Class in org.apache.beam.sdk.coders
-
A
TimestampPrefixingWindowCoder
wraps an arbitrary user-supplied window coder. - TimestampRange - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
A restriction represented by a range of timestamps [from, to).
- TimestampRangeTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
A
RestrictionTracker
for claiming positions in aTimestampRange
in a monotonically increasing fashion. - TimestampRangeTracker(TimestampRange) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- timestamps() - Static method in class org.apache.beam.sdk.transforms.Reify
-
Create a
PTransform
that will output all inputs wrapped in aTimestampedValue
. - timestampsInValue() - Static method in class org.apache.beam.sdk.transforms.Reify
-
Create a
PTransform
that will output all inputKVs
with the timestamp inside the value. - TimestampTransform - Class in org.apache.beam.sdk.transforms.windowing
-
For internal use only; no backwards-compatibility guarantees.
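A minimal sketch of the Reify.timestamps() entry above, assuming events is a PCollection<String>:

    import org.apache.beam.sdk.transforms.Reify;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TimestampedValue;

    // Each element is wrapped together with its implicit Beam timestamp.
    PCollection<TimestampedValue<String>> stamped = events.apply(Reify.timestamps());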
- TimestampTransform.AlignTo - Class in org.apache.beam.sdk.transforms.windowing
-
For internal use only; no backwards-compatibility guarantees.
- TimestampTransform.Delay - Class in org.apache.beam.sdk.transforms.windowing
-
For internal use only; no backwards-compatibility guarantees.
- TimestampUtils - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
Provides methods to convert timestamps to their nanoseconds representation and back.
- TimestampUtils() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
- timeSupplier - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- TimeUtil - Class in org.apache.beam.runners.dataflow.util
-
A helper class for converting between Dataflow API and SDK time representations.
- TimeUtil - Class in org.apache.beam.sdk.io.aws2.kinesis
-
Time conversion utilities.
- TimeWithLocalTzType() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils.TimeWithLocalTzType
- TINY_INT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- TmpCheckpointDirFactory() - Constructor for class org.apache.beam.runners.spark.SparkCommonPipelineOptions.TmpCheckpointDirFactory
- to(long) - Method in class org.apache.beam.sdk.io.GenerateSequence
-
Specifies the maximum number to generate (exclusive).
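For example (a minimal sketch, assuming an existing Pipeline named pipeline); to(100) is exclusive, so this yields 0 through 99:

    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<Long> nums = pipeline.apply(GenerateSequence.from(0).to(100));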
- to(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Writes to the given table, specified as a
TableReference
. - to(Class<OutputT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
-
Convert a
PCollection
<InputT> to aPCollection
<OutputT>. - to(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Writes to file(s) with the given output prefix.
- to(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- to(String) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
-
Queue URL to write to.
- to(String) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Specifies a common directory for all generated files.
- to(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Writes to the given table, specified in the format described in
BigQueryHelpers.parseTableSpec(java.lang.String)
. - to(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Publishes to the specified topic.
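A minimal sketch of the BigQueryIO.Write.to(String) entry above, assuming tableRows is a PCollection<TableRow> (the table spec is hypothetical and uses the parseTableSpec format):

    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;

    tableRows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")
            // Table assumed to already exist; CREATE_IF_NEEDED would also need a schema.
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER));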
- to(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
The name of the table to write to in Snowflake.
- to(String) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
-
Provides the name of the collection to write to in Solr.
- to(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Writes to text files with the given prefix.
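A minimal sketch of writing with this prefix form, assuming lines is a PCollection<String> (the bucket path is hypothetical):

    import org.apache.beam.sdk.io.TextIO;

    // Produces shards such as gs://my-bucket/output/part-00000-of-00003.txt
    lines.apply(
        TextIO.write().to("gs://my-bucket/output/part").withSuffix(".txt").withNumShards(3));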
- to(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
- to(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
-
Writes TFRecord file(s) with the given output prefix.
- to(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
-
Writes to files with the given path prefix.
- to(DynamicAvroDestinations<T, ?, T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
-
Deprecated.Use
FileIO.write()
orFileIO.writeDynamic()
instead. - to(DynamicAvroDestinations<UserT, NewDestinationT, OutputT>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Deprecated.Use
FileIO.write()
orFileIO.writeDynamic()
instead. - to(SqsIO.WriteBatches.DynamicDestination<T>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
-
A dynamic, record-based destination to write to.
- to(FileBasedSink.DynamicDestinations<String, ?, String>) - Method in class org.apache.beam.sdk.io.TextIO.Write
-
Deprecated.
- to(FileBasedSink.DynamicDestinations<UserT, NewDestinationT, String>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Deprecated.
- to(FileBasedSink.FilenamePolicy) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Writes to files named according to the given
FileBasedSink.FilenamePolicy
. - to(FileBasedSink.FilenamePolicy) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- to(FileBasedSink.FilenamePolicy) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Writes to files named according to the given
FileBasedSink.FilenamePolicy
. - to(FileBasedSink.FilenamePolicy) - Method in class org.apache.beam.sdk.io.TextIO.Write
- to(FileBasedSink<UserT, DestinationT, OutputT>) - Static method in class org.apache.beam.sdk.io.WriteFiles
-
Creates a
WriteFiles
transform that writes to the givenFileBasedSink
, letting the runner control how many different shards are produced. - to(ResourceId) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Writes to file(s) with the given output prefix.
- to(ResourceId) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- to(ResourceId) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
- to(ResourceId) - Method in class org.apache.beam.sdk.io.TextIO.Write
- to(ResourceId) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
-
Writes TFRecord file(s) with a prefix given by the specified resource.
- to(DynamicDestinations<T, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Writes to the table and schema specified by the
DynamicDestinations
object. - to(DynamicDestinations) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.WriteRows
- to(Solace.Queue) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
-
Write to a Solace queue.
- to(Solace.Topic) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
-
Write to a Solace topic.
- to(ValueProvider<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
- to(ValueProvider<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Like
FileIO.Write.to(String)
but with aValueProvider
. - to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Same as
BigQueryIO.Write.to(String)
, but with aValueProvider
. - to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Like
topic()
but with aValueProvider
. - to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
- to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
- to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TextIO.Write
- to(SerializableFunction<String, DefaultFilenamePolicy.Params>, DefaultFilenamePolicy.Params) - Method in class org.apache.beam.sdk.io.TextIO.Write
-
Deprecated.
- to(SerializableFunction<ValueInSingleWindow<T>, String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Provides a function to dynamically specify the target topic per message.
- to(SerializableFunction<ValueInSingleWindow<T>, TableDestination>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Writes to the table specified by the given table function.
- to(SerializableFunction<UserT, DefaultFilenamePolicy.Params>, DefaultFilenamePolicy.Params) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Deprecated.
- to(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
-
Convert a
PCollection
<InputT> to aPCollection
<OutputT>. - to(TableIdentifier) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.WriteRows
- toAbsolutePath() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
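A minimal sketch of the Convert.to entries above, assuming rows is a schema-aware PCollection<Row> and Purchase is a hypothetical POJO whose registered schema matches the rows' schema:

    import org.apache.beam.sdk.schemas.transforms.Convert;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Purchase stands in for any type with a matching registered schema.
    PCollection<Purchase> purchases = rows.apply(Convert.to(Purchase.class));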
- toAdditionalInputs(Iterable<PCollectionView<?>>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Expands a list of
PCollectionView
into the form needed forPTransform.getAdditionalInputs()
. - toAvroField(Schema.Field, String) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Get Avro Field from Beam Field.
- toAvroSchema(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
- toAvroSchema(Schema, String, String) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Converts a Beam Schema into an AVRO schema.
- toAvroType(String, String) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
-
Convert to an AVRO type.
- toBaseType(byte[]) - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
- toBaseType(InputT) - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
-
Convert the input type to the Java type used by the base
Schema.FieldType
. - toBaseType(Instant) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
- toBaseType(LocalDate) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.LocalMktDate
- toBaseType(LocalDate) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCDateOnly
- toBaseType(LocalTime) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
- toBaseType(OffsetDateTime) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
- toBaseType(OffsetTime) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
- toBaseType(Duration) - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
- toBaseType(Instant) - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
- toBaseType(Instant) - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
- toBaseType(LocalDate) - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
- toBaseType(LocalDateTime) - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- toBaseType(LocalTime) - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
- toBaseType(UUID) - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
- toBaseType(EnumerationType.Value) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- toBaseType(OneOfType.Value) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
- toBaseType(Schema) - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
- toBaseType(PythonCallableSource) - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
- toBaseType(T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
- toBeamField(Schema.Field) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Get Beam Field from Avro Field.
- toBeamObject(Value, boolean) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
- toBeamObject(Value, Schema.FieldType, boolean) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
- toBeamRow() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
-
Serializes configuration to a
Row
. - toBeamRow() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
-
Serializes configuration to a
Row
. - toBeamRow(Value, Schema, boolean) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
- toBeamRow(String, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.YamlUtils
- toBeamRow(String, Schema, boolean) - Static method in class org.apache.beam.sdk.schemas.utils.YamlUtils
- toBeamRow(Map<String, Object>, Schema, boolean) - Static method in class org.apache.beam.sdk.schemas.utils.YamlUtils
- toBeamRow(GenericRecord, Schema, BigQueryUtils.ConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- toBeamRow(Schema, TableRow) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- toBeamRow(Schema, TableSchema, TableRow) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Tries to parse the JSON
TableRow
from BigQuery. - toBeamRowStrict(GenericRecord, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Strict conversion from AVRO to Beam, strict because it doesn't do widening or narrowing during conversion.
- toBeamRowStrict(GenericRecord, Schema, GenericData) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Strict conversion from AVRO to Beam, strict because it doesn't do widening or narrowing during conversion.
- toBeamSchema(Class<?>) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Converts AVRO schema to Beam row schema.
- toBeamSchema(ResultSetMetaData) - Static method in class org.apache.beam.sdk.io.jdbc.SchemaUtil
-
Infers the Beam
Schema
fromResultSetMetaData
. - toBeamSchema(List<Field>) - Static method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion.ArrowSchemaTranslator
- toBeamSchema(Schema) - Static method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion.ArrowSchemaTranslator
- toBeamSchema(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Converts AVRO schema to Beam row schema.
- toBeamType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
-
Convert to a Beam type.
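A minimal sketch of round-tripping schemas with the AvroUtils methods above (avroSchema is assumed to be an org.apache.avro.Schema obtained elsewhere):

    import org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils;
    import org.apache.beam.sdk.schemas.Schema;

    Schema beamSchema = AvroUtils.toBeamSchema(avroSchema);
    org.apache.avro.Schema backToAvro = AvroUtils.toAvroSchema(beamSchema);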
- toBeamType(Type) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
- ToBigtableRowFn(Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation.ToBigtableRowFn
- toBuilder() - Method in class org.apache.beam.io.requestresponse.Monitoring
- toBuilder() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
- toBuilder() - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
- toBuilder() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
- toBuilder() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
- toBuilder() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
- toBuilder() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
- toBuilder() - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
- toBuilder() - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
- toBuilder() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Creates a new
S3FileSystemConfiguration.Builder
with values initialized by this instance's properties. - toBuilder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Create a new
RpcQosOptions.Builder
initialized with the values from this instance. - toBuilder() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
Transforms the instance into a builder, so field values can be modified.
- toBuilder() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
- toBuilder() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
- toBuilder() - Method in class org.apache.beam.sdk.io.solace.data.Semp.Queue
- toBuilder() - Method in class org.apache.beam.sdk.io.solace.data.Semp.QueueData
- toBuilder() - Method in class org.apache.beam.sdk.schemas.Schema.Field
- toBuilder() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- toBuilder() - Method in class org.apache.beam.sdk.transforms.DoFnSchemaInformation
- toBuilder() - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError
- toByteArray(T, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
Utility method for serializing an object using the specified coder.
- toByteArray(T, Coder<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.CoderHelpers
-
Utility method for serializing an object using the specified coder.
- toByteArrays(Iterable<T>, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
Utility method for serializing an Iterable of values using the specified coder.
- toByteArrays(Iterator<T>, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
Utility method for serializing an Iterator of values using the specified coder.
- toByteArrayWithTs(T, Coder<T>, Instant) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
Utility method for serializing an object using the specified coder, appending the timestamp representation.
- toByteFunction(Coder<K>, Coder<V>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
A function wrapper for converting a key-value pair to a byte array pair.
- toByteFunction(Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
A function wrapper for converting an object to a bytearray.
- toByteFunctionWithTs(Coder<K>, Coder<V>, Function<Tuple2<K, V>, Instant>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
A function wrapper for converting a key-value pair to a byte array pair, where the key in the resulting ByteArray contains (key, timestamp).
- toBytes() - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHint
-
Defines how to represent the hint as a bytestring.
- toCalciteRowType(Schema, RelDataTypeFactory) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
-
Create an instance of
RelDataType
so it can be used to create a table. - toCalciteType(Type, boolean, RexBuilder) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlCalciteTranslationUtils
- toCamelCase() - Method in class org.apache.beam.sdk.schemas.Schema
-
Recursively converts all field names to `lowerCamelCase`.
- toCamelCase() - Method in class org.apache.beam.sdk.values.Row
-
Returns an equivalent
Row
with `lowerCamelCase` field names. - toChangeStreamRecords(PartitionMetadata, ChangeStreamResultSet, ChangeStreamResultSetMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.ChangeStreamRecordMapper
-
In GoogleSQL, change stream records are returned as an array of
Struct
. - toCloudDuration(ReadableDuration) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
-
Converts a
ReadableDuration
into a Dataflow API duration string. - toCloudObject(RowCoder, SdkComponents) - Method in class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
-
Convert to a cloud object.
- toCloudObject(SchemaCoder, SdkComponents) - Method in class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
-
Convert to a cloud object.
- toCloudObject(T, SdkComponents) - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
-
Converts the provided object into an equivalent
CloudObject
. - toCloudPubsubMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
-
Transform messages read from Pub/Sub Lite to their equivalent Cloud Pub/Sub Message that would have been read from PubsubIO.
- toCloudTime(ReadableInstant) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
-
Converts a
ReadableInstant
into a Dataflow API time value. - toConfigRow(BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.BigQueryStorageReadSchemaTransformTranslator
- toConfigRow(BigQueryWriteSchemaTransformProvider.BigQueryWriteSchemaTransform) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.BigQueryWriteSchemaTransformTranslator
- toConfigRow(TFRecordReadSchemaTransformProvider.TFRecordReadSchemaTransform) - Method in class org.apache.beam.sdk.io.TFRecordSchemaTransformTranslation.TFRecordReadSchemaTransformTranslator
- toConfigRow(TFRecordWriteSchemaTransformProvider.TFRecordWriteSchemaTransform) - Method in class org.apache.beam.sdk.io.TFRecordSchemaTransformTranslation.TFRecordWriteSchemaTransformTranslator
- toDefaultPolicies(SerializableFunction<UserT, DefaultFilenamePolicy.Params>, DefaultFilenamePolicy.Params, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
-
Returns a
FileBasedSink.DynamicDestinations
that returns instances ofDefaultFilenamePolicy
configured with the givenDefaultFilenamePolicy.Params
. - toDuration(Row) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.DurationConvert
-
ByteBuddy conversion for NanosDuration base type to Duration.
- toEnumerable(BeamRelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
- toFeedRange() - Method in class org.apache.beam.sdk.io.azure.cosmos.NormalizedRange
- toField(String, RelDataType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- toField(RelDataTypeField) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- toFieldType(RelDataType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- toFieldType(SqlTypeNameSpec) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- toFieldType(SqlTypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- toFile() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- toGcpBackOff(BackOff) - Static method in class org.apache.beam.sdk.extensions.gcp.util.BackOffAdapter
-
Returns an adapter to convert from
BackOff
toBackOff
. - toGenericAvroSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a BigQuery
TableSchema
to AvroSchema
. - toGenericAvroSchema(TableSchema, Boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a BigQuery
TableSchema
to AvroSchema
. - toGenericAvroSchema(String, List<TableFieldSchema>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a list of BigQuery
TableFieldSchema
to AvroSchema
. - toGenericAvroSchema(String, List<TableFieldSchema>, Boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a list of BigQuery
TableFieldSchema
to AvroSchema
. - toGenericRecord(Row) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Convert from a Beam Row to an AVRO GenericRecord.
- toGenericRecord(Row, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
-
Convert from a Beam Row to an AVRO GenericRecord.
- toHex(byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
- toInputType(byte[]) - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
- toInputType(byte[]) - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
- toInputType(byte[]) - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
- toInputType(BaseT) - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
-
Convert the Java type used by the base
Schema.FieldType
to the input type. - toInputType(String) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.LocalMktDate
- toInputType(String) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCDateOnly
- toInputType(String) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
- toInputType(String) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
- toInputType(String) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
- toInputType(String) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
- toInputType(String) - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
- toInputType(Integer) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- toInputType(Long) - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
- toInputType(Long) - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
- toInputType(String) - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
- toInputType(String) - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
- toInputType(BigDecimal) - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedPrecisionNumeric
- toInputType(Row) - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
- toInputType(Row) - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
- toInputType(Row) - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
- toInputType(Row) - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
- toInputType(Row) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
- toInputType(Row) - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
- toInputType(T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
- toInt(LocalDate, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
- toInt(LocalTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
- toInt(LocalDate, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.DateConversion
- toInt(LocalTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeConversion
- toJava(Instant) - Static method in class org.apache.beam.sdk.io.aws2.kinesis.TimeUtil
- toJoda(Instant) - Static method in class org.apache.beam.sdk.io.aws2.kinesis.TimeUtil
- toJodaTime(Instant) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
- toJson() - Method in class org.apache.beam.io.debezium.SourceRecordJson
-
Transforms the extracted data to a JSON string.
- ToJson<T> - Class in org.apache.beam.sdk.transforms
-
Creates a
PTransform
that serializes UTF-8 JSON objects from a Schema-aware PCollection (i.e. a PCollection with a schema). - toJsonString(Object) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
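A minimal sketch of ToJson, assuming pojos is a PCollection with a registered schema:

    import org.apache.beam.sdk.transforms.ToJson;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<String> json = pojos.apply(ToJson.of());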
- tokenNames - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
-
Deprecated.Use
FieldSpecifierNotationLexer.VOCABULARY
instead. - tokenNames - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
Deprecated.Use
FieldSpecifierNotationParser.VOCABULARY
instead. - ToListViewDoFn() - Constructor for class org.apache.beam.sdk.transforms.View.ToListViewDoFn
- toLogicalBaseType(Schema.LogicalType<InputT, BaseT>, InputT) - Static method in class org.apache.beam.sdk.schemas.SchemaUtils
-
Returns the base type given a logical type and the input type.
- toLogicalInputType(Schema.LogicalType<InputT, BaseT>, BaseT) - Static method in class org.apache.beam.sdk.schemas.SchemaUtils
-
Returns the input type given a logical type and the base type.
- toLong(Instant, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
- toLong(Instant, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
- toLong(LocalDateTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
- toLong(LocalDateTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
- toLong(LocalTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
- toLong(DateTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.LossyTimestampMicrosConversion
- toLong(DateTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampConversion
- toLong(LocalTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.LossyTimeMicrosConversion
- toMap(ArrayData, ArrayData, DataType, DataType) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
- toModel() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Converts this message to its API model representation.
- toModificationRel(RelOptCluster, RelOptTable, Prepare.CatalogReader, RelNode, TableModify.Operation, List<String>, List<RexNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
- toNanos(Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
-
Converts the given timestamp to its nanoseconds representation.
- toNativeString() - Method in class org.apache.beam.runners.spark.translation.streaming.StatefulStreamingParDoEvaluator
- toNativeString() - Method in interface org.apache.beam.runners.spark.translation.TransformEvaluator
- toNullableRecordField(Object[], int) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
- Top - Class in org.apache.beam.sdk.transforms
-
PTransform
s for finding the largest (or smallest) set of elements in aPCollection
, or the largest (or smallest) set of values associated with each key in aPCollection
ofKV
s. - TOP_LEVEL - Enum constant in enum class org.apache.beam.sdk.io.FileSystem.LineageLevel
- Top.Largest<T> - Class in org.apache.beam.sdk.transforms
-
Deprecated.use
Top.Natural
instead - Top.Natural<T> - Class in org.apache.beam.sdk.transforms
-
A
Serializable
Comparator
that uses the compared elements' natural ordering. - Top.Reversed<T> - Class in org.apache.beam.sdk.transforms
-
Serializable
Comparator
that uses the reverse of the compared elements' natural ordering. - Top.Smallest<T> - Class in org.apache.beam.sdk.transforms
-
Deprecated.use
Top.Reversed
instead - Top.TopCombineFn<T,
ComparatorT> - Class in org.apache.beam.sdk.transforms -
CombineFn
forTop
transforms that combines a bunch ofT
s into a singlecount
-longList<T>
, usingcompareFn
to choose the largestT
s. - toPairByKeyInWindowedValue(Coder<K>) - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
Extract key from a
WindowedValue
KV
into a pair. - toPairFlatMapFunction() - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
KV
to pair flatmap function. - toPairFunction() - Static method in class org.apache.beam.runners.spark.translation.TranslationUtils
-
KV
to pair function. - toPCollection(Pipeline, BeamRelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- toPCollection(Pipeline, BeamRelNode, PTransform<PCollection<Row>, ? extends POutput>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
- TopCombineFn(int, ComparatorT) - Constructor for class org.apache.beam.sdk.transforms.Top.TopCombineFn
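A minimal sketch of Top, assuming scores is a PCollection<Long>; Top.largest(3) yields a single-element PCollection holding the three largest values:

    import java.util.List;
    import org.apache.beam.sdk.transforms.Top;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<List<Long>> top3 = scores.apply(Top.largest(3));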
- toPeriodicDStream(JavaDStream<WindowedValue<KV<KeyT, ValueT>>>) - Static method in class org.apache.beam.runners.spark.util.TimerUtils
-
Converts a standard DStream into a periodic DStream that ensures all keys are processed in every micro-batch, even if they don't receive new data.
- topic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
- TOPIC - Enum constant in enum class org.apache.beam.sdk.io.solace.data.Solace.DestinationType
- TopicPartitionCoder - Class in org.apache.beam.sdk.io.kafka
-
The
Coder
for encoding and decodingTopicPartition
in Beam. - TopicPartitionCoder() - Constructor for class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
- topicPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Topic path where events will be published.
- topicPath() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions
- topicPathFromName(String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- topicPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- toProto() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Convert to
JobApi.JobInfo
. - toProto(PubsubMessage) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages
- toProvisionInfo() - Method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
- toPTransform() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
- toPTransform() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
- toRealPath(LinkOption...) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- toRecordField(Object[], int) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
- toRel(RelOptTable.ToRelContext, RelOptTable) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
- toRelDataType(RelDataTypeFactory, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
- toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
- toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TextIO.Write
- toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
- toResourceName() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- toRexNode(Value, RexBuilder) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlCalciteTranslationUtils
- toRow() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
- toRow(Duration) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.DurationConvert
-
ByteBuddy conversion for Duration to NanosDuration base type.
- toRow(Timestamp) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.TimestampConvert
-
ByteBuddy conversion for Timestamp to NanosInstant base type.
- toRow(Schema) - Static method in class org.apache.beam.sdk.values.Row
-
Creates a
Row
from the list of values andRow.getSchema()
. - toRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
-
Given a type, return a function that converts that type to a
Row
object. If no schema exists, returns null. - toRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Deprecated.
- toRowFunction(TypeDescriptor<T>) - Method in interface org.apache.beam.sdk.schemas.SchemaProvider
-
Given a type, return a function that converts that type to a
Row
object. If no schema exists, returns null. - toRowList(BeamRelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
- toRowList(BeamRelNode, Map<String, String>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
- toRows() - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
-
Convert a
PCollection
<InputT> into aPCollection
<Row>. - toSchema() - Static method in class org.apache.beam.sdk.schemas.Schema
-
Collects a stream of
Schema.Field
s into aSchema
. - toSchema(RelDataType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
-
Generate
Schema
fromRelDataType
which is used to create table. - toSeconds(Instant) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
- toSeq(Collection<Object>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
- toSeq(ArrayData) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
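A minimal sketch of the Schema.toSchema() collector above:

    import java.util.stream.Stream;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.Schema.Field;
    import org.apache.beam.sdk.schemas.Schema.FieldType;

    Schema schema =
        Stream.of(Field.of("id", FieldType.INT64), Field.of("name", FieldType.STRING))
            .collect(Schema.toSchema());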
- toSideInputBroadcast() - Method in class org.apache.beam.runners.spark.translation.SideInputMetadata
-
Converts this metadata to a
SideInputBroadcast
instance. - toSnakeCase() - Method in class org.apache.beam.sdk.schemas.Schema
-
Recursively converts all field names to `snake_case`.
- toSnakeCase() - Method in class org.apache.beam.sdk.values.Row
-
Returns an equivalent
Row
with `snake_case` field names. - toSnapshot(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- toSql(RexProgram, RexNode) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamSqlUnparseContext
- toSqlTypeName(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- toState(String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
- toString() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
- toString() - Method in class org.apache.beam.runners.dataflow.DataflowRunner
- toString() - Method in class org.apache.beam.runners.dataflow.TestDataflowRunner
- toString() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
- toString() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
- toString() - Method in class org.apache.beam.runners.flink.FlinkRunner
- toString() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
- toString() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
- toString() - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeSerializer
- toString() - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueTypeInformation
- toString() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceReaderBase.ReaderAndOutput
- toString() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSourceSplit
- toString() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItem
- toString() - Method in class org.apache.beam.runners.portability.PortableRunner
- toString() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
- toString() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
- toString() - Method in class org.apache.beam.runners.spark.util.TimerUtils.TimerMarker
- toString() - Method in class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated.
- toString() - Method in class org.apache.beam.sdk.coders.DelegateCoder
- toString() - Method in class org.apache.beam.sdk.coders.SerializableCoder
- toString() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- toString() - Method in class org.apache.beam.sdk.coders.StructuralByteArray
- toString() - Method in class org.apache.beam.sdk.coders.StructuredCoder
- toString() - Method in class org.apache.beam.sdk.coders.ZstdCoder
- toString() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
- toString() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- toString() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
- toString() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
- toString() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
- toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPCall
- toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPFieldRef
- toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperator
- toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
- toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
- toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
- toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
- toString() - Method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
- toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
- toString() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
- toString() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergFilter
- toString() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableFilter
- toString() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
- toString() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
- toString() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
- toString() - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
- toString() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
- toString() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
- toString() - Method in class org.apache.beam.sdk.io.FileBasedSource
- toString() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
- toString() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
-
Returns the string representation of this
ResourceId
. - toString() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
- toString() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
- toString() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
- toString() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
- toString() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
- toString() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
- toString() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- toString() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
- toString() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteSuccessSummary
- toString() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- toString() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
- toString() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- toString() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
- toString() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEndRecord
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionEventRecord
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionStartRecord
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- toString() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- toString() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
- toString() - Method in enum class org.apache.beam.sdk.io.kafka.KafkaTimestampType
- toString() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
- toString() - Method in class org.apache.beam.sdk.io.range.ByteKey
- toString() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
- toString() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
- toString() - Method in class org.apache.beam.sdk.io.range.OffsetRange
- toString() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
- toString() - Method in class org.apache.beam.sdk.io.solace.broker.BrokerResponse
- toString() - Method in class org.apache.beam.sdk.io.tika.ParseResult
- toString() - Method in enum class org.apache.beam.sdk.metrics.Lineage.Type
- toString() - Method in class org.apache.beam.sdk.metrics.MetricKey
- toString() - Method in class org.apache.beam.sdk.metrics.MetricName
- toString() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
- toString() - Method in class org.apache.beam.sdk.metrics.MetricResults
- toString() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
- toString() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
- toString() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
- toString() - Method in class org.apache.beam.sdk.Pipeline
- toString() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
- toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType.Value
- toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
- toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
- toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
- toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
- toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
- toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
- toString() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- toString() - Method in class org.apache.beam.sdk.schemas.Schema.Options
- toString() - Method in class org.apache.beam.sdk.schemas.Schema
- toString() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
- toString() - Method in class org.apache.beam.sdk.schemas.transforms.Cast.Narrowing
- toString() - Method in class org.apache.beam.sdk.schemas.transforms.Cast.Widening
- toString() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- toString() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
- toString() - Method in class org.apache.beam.sdk.testing.TestPipeline
- toString() - Method in class org.apache.beam.sdk.transforms.Combine.Holder
- toString() - Method in class org.apache.beam.sdk.transforms.Contextful
- toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
- toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
- toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
- toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
- toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
- toString() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
- toString() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
- toString() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
- toString() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
- toString() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
- toString() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
- toString() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
- toString() - Method in class org.apache.beam.sdk.transforms.PTransform
- toString() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
- toString() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
- toString() - Method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
- toString() - Method in class org.apache.beam.sdk.values.KV
- toString() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
- toString() - Method in class org.apache.beam.sdk.values.PValueBase
- toString() - Method in class org.apache.beam.sdk.values.Row
- toString() - Method in class org.apache.beam.sdk.values.ShardedKey
- toString() - Method in class org.apache.beam.sdk.values.TimestampedValue
- toString() - Method in class org.apache.beam.sdk.values.TupleTag
- toString() - Method in class org.apache.beam.sdk.values.TupleTagList
- toString() - Method in class org.apache.beam.sdk.values.TypeDescriptor
- toString() - Method in class org.apache.beam.sdk.values.TypeParameter
- toString() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
- toString() - Method in class org.apache.beam.sdk.values.WindowingStrategy
- toString(boolean) - Method in class org.apache.beam.sdk.values.Row
-
Convert Row to String.
- toString(EnumerationType.Value) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
- toString(Metric) - Static method in class org.apache.beam.runners.flink.metrics.Metrics
- toString(StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
Creates a human-readable representation of the given state of this condition.
- ToString - Class in org.apache.beam.sdk.transforms
-
PTransforms for converting a PCollection<?>, PCollection<KV<?,?>>, or PCollection<Iterable<?>> to a PCollection<String>.
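A minimal sketch of ToString in use; the pipeline and input values are illustrative, not from this index:
```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.ToString;
import org.apache.beam.sdk.values.PCollection;

Pipeline pipeline = Pipeline.create();

// Each element is converted via its toString(): "1", "2", "3".
// ToString.kvs() and ToString.iterables() cover the other input shapes.
PCollection<String> strings =
    pipeline
        .apply(Create.of(1L, 2L, 3L))
        .apply(ToString.elements());
```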
- toStringTimestamp(long) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamCodegenUtils
- toStringUTF8(byte[]) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamCodegenUtils
- toTableReference(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- toTableRow() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- toTableRow(SerializableFunction<T, Row>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a Beam schema type to a BigQuery TableRow.
- toTableRow(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a Beam Row to a BigQuery TableRow.
- toTableSchema(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a Beam Schema to a BigQuery TableSchema.
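A minimal sketch of these conversions, assuming an illustrative two-field schema:
```java
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.values.Row;

// Field names and values are illustrative.
Schema schema = Schema.builder().addStringField("name").addInt64Field("count").build();
Row row = Row.withSchema(schema).addValues("beam", 42L).build();

TableSchema tableSchema = BigQueryUtils.toTableSchema(schema); // Beam Schema -> BigQuery TableSchema
TableRow tableRow = BigQueryUtils.toTableRow(row);             // Beam Row -> BigQuery TableRow
```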
- toTableSpec(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
-
Returns a canonical string representation of the TableReference.
- toTableSpec(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- toThreetenInstant(Instant) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
- toTimestamp(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- toTimestamp(BigDecimal) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
-
Converts nanoseconds to their respective timestamp.
- toTimestamp(Row) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.TimestampConvert
-
ByteBuddy conversion for NanosInstant base type to Timestamp.
- toTreeMap(ArrayData, ArrayData, DataType, DataType) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
- toUnsplittableSource(BoundedSource<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Returns an equivalent unsplittable BoundedSource<T>.
- toUri() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
- toWrite(Schema, PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider.BigQueryFileLoadsSchemaTransform
- toZetaSqlStructType(Schema) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
- toZetaSqlStructValue(Row, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
- toZetaSqlType(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
- toZetaSqlType(RelDataType) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlCalciteTranslationUtils
- toZetaSqlValue(Object, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
- TRACE - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
-
Deprecated. Level for logging tracing messages.
- TRACE - Enum constant in enum class org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
-
LogLevel for logging tracing messages.
- TrackerWithProgress - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- TrackerWithProgress() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.TrackerWithProgress
- Transaction - Class in org.apache.beam.sdk.io.gcp.spanner
-
A transaction object.
- Transaction() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.Transaction
- transactionId() - Method in class org.apache.beam.sdk.io.gcp.spanner.Transaction
- TransactionResult(T, Timestamp) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
- Transactions - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- transfer() - Method in class org.apache.beam.runners.portability.CloseableResource
-
Returns a new CloseableResource that owns the underlying resource and relinquishes ownership from this CloseableResource.
- transform(Function<T, V>) - Method in class org.apache.beam.sdk.metrics.MetricResult
- TRANSFORM_URN - Static variable in class org.apache.beam.runners.spark.io.ConsoleIO.Write.Unbound
- TRANSFORM_URN - Static variable in class org.apache.beam.runners.spark.io.CreateStream
- transformContainer(Iterable<FromT>, Function<FromT, DestT>) - Static method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils
- TransformEvaluator<TransformT> - Interface in org.apache.beam.runners.spark.translation
-
Describes a PTransform evaluator.
- TransformExecutor - Interface in org.apache.beam.runners.direct
-
A Runnable that will execute a PTransform on some bundle of input.
- transformId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
- transformId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
- transformId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
- TransformingMap(Map<K1, V1>, Function<K1, K2>, Function<V1, V2>) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- TransformProvider<InputT, OutputT> - Interface in org.apache.beam.sdk.expansion.service
-
Provides a mapping of RunnerApi.FunctionSpec to a PTransform, together with mappings of its inputs and outputs to maps of PCollections.
-
A utility that can be used to manage a Beam Transform Service.
- transformTo(RelNode, Map<RelNode, RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- transformTo(RelNode, Map<RelNode, RelNode>, RelHintsPropagator) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
- TransformTranslator<InT, OutT, TransformT> - Class in org.apache.beam.runners.spark.structuredstreaming.translation
-
A TransformTranslator provides the capability to translate a specific primitive or composite PTransform into its Spark correspondence.
- TransformTranslator - Class in org.apache.beam.runners.spark.translation
-
Supports translation between a Beam transform and Spark's operations on RDDs.
- TransformTranslator<TransformT> - Interface in org.apache.beam.runners.dataflow
-
A TransformTranslator knows how to translate a particular subclass of PTransform for the Cloud Dataflow service.
- TransformTranslator(float) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator
- TransformTranslator.Context - Class in org.apache.beam.runners.spark.structuredstreaming.translation
-
Available mutable context to translate a PTransform.
- TransformTranslator.StepTranslationContext - Interface in org.apache.beam.runners.dataflow
-
The interface for a TransformTranslator to build a Dataflow step.
- TransformTranslator.TranslationContext - Interface in org.apache.beam.runners.dataflow
-
The interface provided to registered callbacks for interacting with the DataflowRunner, including reading and writing the values of PCollections and side inputs.
- TransformTranslator.Translator - Class in org.apache.beam.runners.spark.translation
-
Translator matches Beam transformation with the appropriate evaluator.
- translate(String, RunnerApi.Pipeline, T) - Method in interface org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.PTransformTranslator
- translate(RunnerApi.Pipeline, SparkStreamingTranslationContext) - Method in class org.apache.beam.runners.spark.translation.SparkStreamingPortablePipelineTranslator
-
Translates the pipeline from Beam into the Spark context.
- translate(RunnerApi.Pipeline, SparkTranslationContext) - Method in class org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator
-
Translates the pipeline from Beam into the Spark context.
- translate(RunnerApi.Pipeline, T) - Method in interface org.apache.beam.runners.spark.translation.SparkPortablePipelineTranslator
-
Translates the given pipeline.
- translate(FlinkBatchPortablePipelineTranslator.BatchTranslationContext, RunnerApi.Pipeline) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
- translate(FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext, RunnerApi.Pipeline) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
- translate(Pipeline) - Method in class org.apache.beam.runners.twister2.translators.Twister2PipelineTranslator
-
Translates the pipeline by passing this class as a visitor.
- translate(Pipeline) - Method in class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
-
Translates the pipeline into a Twister2 TSet graph.
- translate(Pipeline, RunnerApi.Pipeline, SdkComponents, DataflowRunner, List<DataflowPackage>) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
-
Translates a Pipeline into a JobSpecification.
- translate(Pipeline, SparkSession, SparkCommonPipelineOptions) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
-
Translates a Beam pipeline into its Spark correspondence using the Spark SQL / Dataset API.
- translate(AppliedPTransform<?, ?, PrimitiveParDoSingleFactory.ParDoSingle<?, ?>>, SdkComponents) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.PayloadTranslator
- translate(AppliedPTransform<?, ?, T>, SdkComponents) - Method in class org.apache.beam.sdk.schemas.transforms.SchemaTransformTranslation.SchemaTransformPayloadTranslator
- translate(TransformHierarchy.Node, TransformT) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
-
Determine if this Node belongs to a Bounded branch of the pipeline, or Unbounded, and translate with the proper translator.
- translate(PipelineNode.PTransformNode, RunnerApi.Pipeline, FlinkBatchPortablePipelineTranslator.BatchTranslationContext) - Method in interface org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.PTransformTranslator
-
Translate a PTransform into the given translation context.
- translate(TransformT, TransformTranslator.TranslationContext) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator
- translate(TransformT, TransformTranslator.Context) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator
- translate(T, RunnerApi.Pipeline) - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator
-
Translates the given pipeline.
- translateBounded(PTransform<?, ?>) - Method in interface org.apache.beam.runners.spark.translation.SparkPipelineTranslator
- translateBounded(PTransform<?, ?>) - Method in class org.apache.beam.runners.spark.translation.streaming.StreamingTransformTranslator.Translator
- translateBounded(PTransform<?, ?>) - Method in class org.apache.beam.runners.spark.translation.TransformTranslator.Translator
- translateNode(Flatten.PCollections<T>, Twister2BatchTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.batch.FlattenTranslatorBatch
- translateNode(GroupByKey<K, V>, Twister2BatchTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.batch.GroupByKeyTranslatorBatch
- translateNode(Impulse, Twister2BatchTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.batch.ImpulseTranslatorBatch
- translateNode(ParDo.MultiOutput<InputT, OutputT>, Twister2BatchTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.batch.ParDoMultiOutputTranslatorBatch
- translateNode(PTransform<PBegin, PCollection<T>>, Twister2StreamTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.streaming.ReadSourceTranslatorStream
- translateNode(View.CreatePCollectionView<ElemT, ViewT>, Twister2BatchTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.batch.PCollectionViewTranslatorBatch
- translateNode(Window.Assign<T>, Twister2BatchTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.batch.AssignWindowTranslatorBatch
- translateNode(SplittableParDo.PrimitiveBoundedRead<T>, Twister2BatchTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.batch.ReadSourceTranslatorBatch
- translateNode(TransformT, Twister2BatchTranslationContext) - Method in interface org.apache.beam.runners.twister2.translators.BatchTransformTranslator
- translateNode(TransformT, Twister2StreamTranslationContext) - Method in interface org.apache.beam.runners.twister2.translators.StreamTransformTranslator
- translateUnbounded(PTransform<?, ?>) - Method in interface org.apache.beam.runners.spark.translation.SparkPipelineTranslator
- translateUnbounded(PTransform<?, ?>) - Method in class org.apache.beam.runners.spark.translation.streaming.StreamingTransformTranslator.Translator
- translateUnbounded(PTransform<?, ?>) - Method in class org.apache.beam.runners.spark.translation.TransformTranslator.Translator
- TranslationUtils - Class in org.apache.beam.runners.spark.translation
-
A set of utilities to help translating Beam transformations into Spark transformations.
- TranslationUtils - Class in org.apache.beam.runners.twister2.utils
-
Utility methods used by the Twister2 runner during translation.
- TranslationUtils.CombineGroupedValues<K, InputT, OutputT> - Class in org.apache.beam.runners.spark.translation
-
A SparkCombineFn function applied to grouped KVs.
- TranslationUtils.TupleTagFilter<V> - Class in org.apache.beam.runners.spark.translation
-
A utility class to filter TupleTags.
- translator - Variable in class org.apache.beam.runners.spark.SparkRunner.Evaluator
- translator - Variable in class org.apache.beam.runners.spark.translation.GroupByKeyVisitor
- Translator() - Constructor for class org.apache.beam.runners.spark.translation.TransformTranslator.Translator
- Translator(SparkPipelineTranslator) - Constructor for class org.apache.beam.runners.spark.translation.streaming.StreamingTransformTranslator.Translator
- Transport - Class in org.apache.beam.sdk.extensions.gcp.util
-
Helpers for cloud communication.
- Transport() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.Transport
- traverseTopologically(Pipeline.PipelineVisitor) - Method in class org.apache.beam.sdk.Pipeline
-
For internal use only; no backwards-compatibility guarantees.
- Trigger - Class in org.apache.beam.sdk.transforms.windowing
-
Triggers control when the elements for a specific key and window are output.
- Trigger() - Constructor for class org.apache.beam.sdk.transforms.windowing.Trigger
- Trigger(List<Trigger>) - Constructor for class org.apache.beam.sdk.transforms.windowing.Trigger
- Trigger.OnceTrigger - Class in org.apache.beam.sdk.transforms.windowing
-
For internal use only; no backwards-compatibility guarantees.
- triggerExpiredTimers(SparkTimerInternals, WindowingStrategy<?, W>, AbstractInOutIterator<?, ?, ?>) - Static method in class org.apache.beam.runners.spark.util.TimerUtils
-
Fires all expired timers using the provided iterator.
- triggering(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Sets a non-default trigger for this Window PTransform.
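A minimal sketch of setting a non-default trigger; the window size, delays, and the input collection `input` are illustrative:
```java
import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.joda.time.Duration;

// One-minute fixed windows that fire at the watermark, with speculative
// early firings every 30 seconds of processing time.
input.apply(
    Window.<String>into(FixedWindows.of(Duration.standardMinutes(1)))
        .triggering(
            AfterWatermark.pastEndOfWindow()
                .withEarlyFirings(
                    AfterProcessingTime.pastFirstElementInPane()
                        .plusDelayOf(Duration.standardSeconds(30))))
        .withAllowedLateness(Duration.ZERO)
        .discardingFiredPanes());
```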
- Triggers - Search tag in class org.apache.beam.sdk.transforms.windowing.Window
- Section
- trim(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- trim(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
- TRIM - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- TRIM_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
- trivial() - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
-
Creates an OutboundObserverFactory that simply delegates to the base factory, with no flow control or synchronization.
- trueLiteral() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
- truncate(long) - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- TRUNCATE - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.TruncateTimestamps
-
Truncate timestamps to millisecond precision.
- TRUNCATE - Enum constant in enum class org.apache.beam.sdk.io.snowflake.enums.WriteDisposition
- TruncateResult() - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.TruncateResult
- trustManagers() - Method in class org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory.SkipCertificateVerificationTrustManagerProvider
- tryAcquireJobLock(Configuration) - Method in interface org.apache.beam.sdk.io.hadoop.format.ExternalSynchronization
-
Tries to acquire a lock for the given job.
- tryAcquireJobLock(Configuration) - Method in class org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization
- tryClaim(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.DetectNewPartitionsRangeTracker
-
Attempts to claim the given position.
- tryClaim(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.ReadChangeStreamPartitionRangeTracker
-
Attempts to claim the given position.
- tryClaim(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
-
Attempts to claim the given position.
- tryClaim(Timestamp, PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- tryClaim(Long) - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
- tryClaim(Long) - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
-
Attempts to claim the given offset.
- tryClaim(StreamProgress) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
-
Claims a new StreamProgress to be processed.
- tryClaim(ByteKey) - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
-
Attempts to claim the given key.
- tryClaim(PositionT) - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
-
Attempts to claim the block of work in the current restriction identified by the given position.
- tryInterrupt(T) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.RestrictionInterrupter
-
Returns true if the restriction tracker should be interrupted in claiming new positions.
- tryProcess() - Method in class org.apache.beam.runners.jet.processors.ParDoP
- tryProcess() - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- tryProcess() - Method in class org.apache.beam.runners.jet.processors.WindowGroupP
- tryProcess(int, Object) - Method in class org.apache.beam.runners.jet.processors.AssignWindowP
- tryProcess(int, Object) - Method in class org.apache.beam.runners.jet.processors.FlattenP
- tryProcess(int, Object) - Method in class org.apache.beam.runners.jet.processors.ViewP
- tryProcess(int, Object) - Method in class org.apache.beam.runners.jet.processors.WindowGroupP
- tryProcessWatermark(Watermark) - Method in class org.apache.beam.runners.jet.processors.ParDoP
- tryProcessWatermark(Watermark) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
- tryProcessWatermark(Watermark) - Method in class org.apache.beam.runners.jet.processors.WindowGroupP
- tryReturnRecordAt(boolean, long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
- tryReturnRecordAt(boolean, Long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
- tryReturnRecordAt(boolean, ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
- tryReturnRecordAt(boolean, PositionT) - Method in interface org.apache.beam.sdk.io.range.RangeTracker
-
Atomically determines whether a record at the given position can be returned and updates internal state.
- trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.DetectNewPartitionsTracker
- trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
-
Splits the work that's left.
- trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.ReadChangeStreamPartitionRangeTracker
-
If the partition token is the InitialPartition.PARTITION_TOKEN, it does not allow for splits (returns null).
- trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
-
Splits the restriction through the following algorithm:
- trySplit(double) - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
- trySplit(double) - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
- trySplit(double) - Method in class org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker
- trySplit(double) - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
- trySplit(double) - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
-
Splits current restriction based on fractionOfRemainder.
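A minimal sketch of the claim/split protocol on the OffsetRangeTracker listed here; the range bounds and the split fraction are illustrative:
```java
import org.apache.beam.sdk.io.range.OffsetRange;
import org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker;
import org.apache.beam.sdk.transforms.splittabledofn.SplitResult;

OffsetRangeTracker tracker = new OffsetRangeTracker(new OffsetRange(0, 100));

// Claim offsets in order; a false return means the remaining work was
// split away or the range is exhausted, so processing must stop.
for (long offset = 0; tracker.tryClaim(offset); offset++) {
  // process the element at `offset`
}

// Runner-initiated in practice: ask for half of the remaining work to be
// split off; returns null when nothing is left to split.
SplitResult<OffsetRange> split = tracker.trySplit(0.5);
```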
- trySplitAtPosition(long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
- trySplitAtPosition(Long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
- trySplitAtPosition(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
- trySplitAtPosition(PositionT) - Method in interface org.apache.beam.sdk.io.range.RangeTracker
-
Atomically splits the current range [RangeTracker.getStartPosition(), RangeTracker.getStopPosition()) into a "primary" part [RangeTracker.getStartPosition(), splitPosition) and a "residual" part [splitPosition, RangeTracker.getStopPosition()), assuming the current last-consumed position is within [RangeTracker.getStartPosition(), splitPosition) (i.e., splitPosition has not been consumed yet).
- tuple(Map<String, TableSchema.ColumnType>) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- tuple(T1, T2) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
- TUPLE - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- TUPLE - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- TUPLE_TAGS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- tupleEncoder(Encoder<T1>, Encoder<T2>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- TupleTag<V> - Class in org.apache.beam.sdk.values
-
A TupleTag is a typed tag to use as the key of a heterogeneously typed tuple, like PCollectionTuple.
- TupleTag() - Constructor for class org.apache.beam.sdk.values.TupleTag
-
Constructs a new TupleTag, with a fresh unique id.
- TupleTag(String) - Constructor for class org.apache.beam.sdk.values.TupleTag
-
Constructs a new TupleTag with the given id.
- TupleTagFilter(TupleTag<V>) - Constructor for class org.apache.beam.runners.spark.translation.TranslationUtils.TupleTagFilter
- TupleTagList - Class in org.apache.beam.sdk.values
-
A TupleTagList is an immutable list of heterogeneously typed TupleTags.
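A minimal sketch of TupleTag and TupleTagList with a multi-output ParDo; the tags, DoFn body, and `input` are illustrative:
```java
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PCollectionTuple;
import org.apache.beam.sdk.values.TupleTag;
import org.apache.beam.sdk.values.TupleTagList;

// Anonymous subclasses capture the element type past erasure.
final TupleTag<String> mainTag = new TupleTag<String>() {};
final TupleTag<String> errorTag = new TupleTag<String>() {};

PCollectionTuple outputs =
    input.apply(
        ParDo.of(
                new DoFn<String, String>() {
                  @ProcessElement
                  public void process(ProcessContext c) {
                    if (c.element().isEmpty()) {
                      c.output(errorTag, c.element()); // additional output
                    } else {
                      c.output(c.element()); // main output
                    }
                  }
                })
            .withOutputTags(mainTag, TupleTagList.of(errorTag)));

PCollection<String> main = outputs.get(mainTag);
PCollection<String> errors = outputs.get(errorTag);
```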
- tupleTypes() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- TVFSlidingWindowFn - Class in org.apache.beam.sdk.extensions.sql.impl
-
TVFSlidingWindowFn assigns windows based on the input row's "window_start" and "window_end" timestamps.
- TVFSlidingWindowFn() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
- TVFStreamingUtils - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
Provides static constants or utils for TVF streaming.
- TVFStreamingUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.TVFStreamingUtils
- Twister2AssignContext<T, W> - Class in org.apache.beam.runners.twister2.utils
-
Window assign context used when applying a WindowFn to a WindowedValue in the Twister2 runner.
- Twister2AssignContext(WindowFn<T, W>, WindowedValue<T>) - Constructor for class org.apache.beam.runners.twister2.utils.Twister2AssignContext
- Twister2BatchPipelineTranslator - Class in org.apache.beam.runners.twister2.translators
-
Twister2 pipeline translator for batch pipelines.
- Twister2BatchPipelineTranslator(Twister2PipelineOptions, Twister2BatchTranslationContext) - Constructor for class org.apache.beam.runners.twister2.translators.Twister2BatchPipelineTranslator
- Twister2BatchTranslationContext - Class in org.apache.beam.runners.twister2
-
Twister2BatchTranslationContext.
- Twister2BatchTranslationContext(Twister2PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.Twister2BatchTranslationContext
- Twister2BoundedSource<T> - Class in org.apache.beam.runners.twister2.translation.wrappers
-
Twister2 wrapper for Bounded Source.
- Twister2BoundedSource(BoundedSource<T>, Twister2TranslationContext, PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.translation.wrappers.Twister2BoundedSource
- Twister2EmptySource<T> - Class in org.apache.beam.runners.twister2.translation.wrappers
-
Empty Source wrapper.
- Twister2EmptySource() - Constructor for class org.apache.beam.runners.twister2.translation.wrappers.Twister2EmptySource
- Twister2PipelineExecutionEnvironment - Class in org.apache.beam.runners.twister2
-
Twister2PipelineExecutionEnvironment.
- Twister2PipelineExecutionEnvironment(Twister2PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
- Twister2PipelineOptions - Interface in org.apache.beam.runners.twister2
-
Twister2PipelineOptions.
- Twister2PipelineResult - Class in org.apache.beam.runners.twister2
-
Represents a Twister2 pipeline execution result.
- Twister2PipelineResult(Twister2JobState) - Constructor for class org.apache.beam.runners.twister2.Twister2PipelineResult
- Twister2PipelineTranslator - Class in org.apache.beam.runners.twister2.translators
-
Twister2PipelineTranslator; both batch and streaming translators need to extend from this.
- Twister2PipelineTranslator() - Constructor for class org.apache.beam.runners.twister2.translators.Twister2PipelineTranslator
- Twister2Runner - Class in org.apache.beam.runners.twister2
-
A PipelineRunner that executes the operations in the pipeline by first translating them to a Twister2 Plan and then executing them either locally or on a Twister2 cluster, depending on the configuration.
- Twister2Runner(Twister2PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.Twister2Runner
- Twister2RunnerRegistrar - Class in org.apache.beam.runners.twister2
-
AutoService registrar - will register Twister2Runner and Twister2Options as possible pipeline runner services.
- Twister2RunnerRegistrar.Options - Class in org.apache.beam.runners.twister2
-
Pipeline options registrar.
- Twister2RunnerRegistrar.Runner - Class in org.apache.beam.runners.twister2
-
Pipeline runner registrar.
- Twister2SideInputReader - Class in org.apache.beam.runners.twister2.utils
- Twister2SideInputReader(Map<TupleTag<?>, WindowingStrategy<?, ?>>, TSetContext) - Constructor for class org.apache.beam.runners.twister2.utils.Twister2SideInputReader
- Twister2SinkFunction<T> - Class in org.apache.beam.runners.twister2.translators.functions
-
Sink Function that collects results.
- Twister2SinkFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
- Twister2StreamPipelineTranslator - Class in org.apache.beam.runners.twister2.translators
-
Twister2 pipeline translator for stream pipelines.
- Twister2StreamPipelineTranslator() - Constructor for class org.apache.beam.runners.twister2.translators.Twister2StreamPipelineTranslator
- Twister2StreamTranslationContext - Class in org.apache.beam.runners.twister2
-
Twister2StreamingTranslationContext.
- Twister2StreamTranslationContext(Twister2PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.Twister2StreamTranslationContext
- Twister2TestRunner - Class in org.apache.beam.runners.twister2
-
A PipelineRunner that executes the operations in the pipeline by first translating them to a Twister2 Plan and then executing them either locally or on a Twister2 cluster, depending on the configuration.
- Twister2TestRunner(Twister2PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.Twister2TestRunner
- Twister2TranslationContext - Class in org.apache.beam.runners.twister2
-
Twister2TranslationContext.
- Twister2TranslationContext(Twister2PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.Twister2TranslationContext
- type - Variable in class org.apache.beam.runners.dataflow.util.OutputReference
- type - Variable in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
- type - Variable in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.TypeWithNullability
- type() - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
-
A type that defines this catalog.
- type() - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- type() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergCatalog
- type(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
- TYPE_ERASURE - Enum constant in enum class org.apache.beam.sdk.coders.CannotProvideCoderException.ReasonCode
-
The reason a coder could not be provided is type erasure, for example when requesting coder inference for a List<T> where T is unknown.
- TypeCode - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents a type of a column within Cloud Spanner.
- TypeCode(String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
-
Constructs a type code from the given String code.
- TypeConversion() - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
- TypedCombineFnDelegate<InputT, AccumT, OutputT> - Class in org.apache.beam.sdk.extensions.sql
-
A Combine.CombineFn delegating all relevant calls to a given delegate.
- TypedCombineFnDelegate(Combine.CombineFn<InputT, AccumT, OutputT>) - Constructor for class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
- typedef(String, Schema.FieldType) - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema.Customizer
- TypeDescriptor<T> - Class in org.apache.beam.sdk.values
-
A description of a Java type, including actual generic parameters where possible.
- TypeDescriptor() - Constructor for class org.apache.beam.sdk.values.TypeDescriptor
-
Creates a TypeDescriptor representing the type parameter T.
- TypeDescriptor(Class<?>) - Constructor for class org.apache.beam.sdk.values.TypeDescriptor
-
Creates a TypeDescriptor representing the type parameter T, which should resolve to a concrete type in the context of the class clazz.
- TypeDescriptor(Object) - Constructor for class org.apache.beam.sdk.values.TypeDescriptor
-
Creates a TypeDescriptor representing the type parameter T, which should resolve to a concrete type in the context of the class clazz.
- TypeDescriptors - Class in org.apache.beam.sdk.values
-
A utility class for creating TypeDescriptor objects for different types, such as Java primitive types, containers and KVs of other TypeDescriptor objects, and extracting type variables of parameterized types (e.g.
- TypeDescriptors() - Constructor for class org.apache.beam.sdk.values.TypeDescriptors
- TypeDescriptors.TypeVariableExtractor<InputT, OutputT> - Interface in org.apache.beam.sdk.values
-
A helper interface for use with TypeDescriptors.extractFromTypeParameters(Object, Class, TypeVariableExtractor).
- TypeDescriptorWithSchema() - Constructor for class org.apache.beam.sdk.schemas.utils.ReflectUtils.TypeDescriptorWithSchema
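A minimal sketch of TypeDescriptor and TypeDescriptors; the types chosen are illustrative:
```java
import java.util.List;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptor;
import org.apache.beam.sdk.values.TypeDescriptors;

// An anonymous subclass captures the full generic type despite erasure.
TypeDescriptor<List<String>> listType = new TypeDescriptor<List<String>>() {};

// Factory methods compose descriptors for common shapes.
TypeDescriptor<KV<String, Long>> kvType =
    TypeDescriptors.kvs(TypeDescriptors.strings(), TypeDescriptors.longs());
```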
- TypedRead() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- TypedSchemaTransformProvider<ConfigT> - Class in org.apache.beam.sdk.schemas.transforms
-
Like SchemaTransformProvider except uses a configuration object instead of Schema and Row.
- TypedWrite() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
- TypedWrite() - Constructor for class org.apache.beam.sdk.io.TextIO.TypedWrite
- typeName() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- TypeParameter<T> - Class in org.apache.beam.sdk.values
-
Captures a free type variable that can be used in TypeDescriptor.where(org.apache.beam.sdk.values.TypeParameter<X>, org.apache.beam.sdk.values.TypeDescriptor<X>).
- TypeParameter() - Constructor for class org.apache.beam.sdk.values.TypeParameter
- typesEqual(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns true if two schemas are equal ignoring field names and descriptions.
- typesEqual(Schema.Field) - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns true if two fields are equal, ignoring name and description.
- typesEqual(Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Returns true if two FieldTypes are equal.
- typeToProtoType(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- TZTimeOnly() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
- TZTimestamp() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
U
- UdafImpl<InputT, AccumT, OutputT> - Class in org.apache.beam.sdk.extensions.sql.impl
-
Implements AggregateFunction to take a Combine.CombineFn as a UDAF.
- UdafImpl(Combine.CombineFn<InputT, AccumT, OutputT>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
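A minimal sketch of wiring a CombineFn in as a SQL UDAF via SqlTransform; the function name "my_sum", the query, and `input` are illustrative:
```java
import org.apache.beam.sdk.extensions.sql.SqlTransform;
import org.apache.beam.sdk.transforms.Sum;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

// Register a CombineFn-backed aggregate and use it in a query.
PCollection<Row> result =
    input.apply(
        SqlTransform.query("SELECT my_sum(price) AS total FROM PCOLLECTION")
            .registerUdaf("my_sum", Sum.ofDoubles()));
```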
- UDF_METHOD - Static variable in interface org.apache.beam.sdk.extensions.sql.BeamSqlUdf
- UdfImplReflectiveFunctionBase - Class in org.apache.beam.sdk.extensions.sql.impl
-
Beam-customized version of ReflectiveFunctionBase, to address BEAM-5921.
- UdfImplReflectiveFunctionBase(Method) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase
-
UdfImplReflectiveFunctionBase constructor.
- UdfImplReflectiveFunctionBase.ParameterListBuilder - Class in org.apache.beam.sdk.extensions.sql.impl
-
Helps build lists of FunctionParameter.
- UdfProvider - Interface in org.apache.beam.sdk.extensions.sql.udf
-
Provider for user-defined functions written in Java.
- UdfTestProvider - Class in org.apache.beam.sdk.extensions.sql.provider
-
Defines Java UDFs for use in tests.
- UdfTestProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider
- UdfTestProvider.DateIncrementAllFn - Class in org.apache.beam.sdk.extensions.sql.provider
- UdfTestProvider.HelloWorldFn - Class in org.apache.beam.sdk.extensions.sql.provider
- UdfTestProvider.IncrementFn - Class in org.apache.beam.sdk.extensions.sql.provider
- UdfTestProvider.IsNullFn - Class in org.apache.beam.sdk.extensions.sql.provider
- UdfTestProvider.MatchFn - Class in org.apache.beam.sdk.extensions.sql.provider
- UdfTestProvider.Sum - Class in org.apache.beam.sdk.extensions.sql.provider
- UdfTestProvider.UnusedFn - Class in org.apache.beam.sdk.extensions.sql.provider
- UdfUdafProvider - Interface in org.apache.beam.sdk.extensions.sql.meta.provider
-
Provider for UDF and UDAF.
- Uint16() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint16
- UINT16 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- UINT16 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- uint16Behavior() - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
- Uint32() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint32
- UINT32 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- UINT32 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- uint32Behavior() - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
- Uint64() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint64
- UINT64 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- UINT64 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- uint64Behavior() - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
- Uint8() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint8
- UINT8 - Enum constant in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
- UINT8 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- uint8Behavior() - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
- unbounded() - Static method in class org.apache.beam.sdk.io.CountingSource
-
Deprecated. Use GenerateSequence instead.
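A minimal sketch of the suggested replacement; the bounds and `pipeline` are illustrative:
```java
import org.apache.beam.sdk.io.GenerateSequence;
import org.apache.beam.sdk.values.PCollection;

// Bounded: the numbers 0..99.
PCollection<Long> bounded = pipeline.apply(GenerateSequence.from(0).to(100));

// Unbounded: an endless sequence, replacing CountingSource.unbounded().
PCollection<Long> unbounded = pipeline.apply(GenerateSequence.from(0));
```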
- unbounded(String, UnboundedSource<T, ?>, SerializablePipelineOptions, int) - Static method in class org.apache.beam.runners.flink.translation.wrappers.streaming.io.source.FlinkSource
- Unbounded(SparkContext, SerializablePipelineOptions, MicrobatchSource<T, CheckpointMarkT>, int) - Constructor for class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
- UNBOUNDED - Enum constant in enum class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.IsBounded
-
Indicates that a Restriction represents an unbounded amount of work.
- UNBOUNDED - Enum constant in enum class org.apache.beam.sdk.values.PCollection.IsBounded
-
Indicates that a PCollection contains an unbounded number of elements.
- UNBOUNDED_UNKNOWN - Static variable in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
- UnboundedBatchedSolaceWriter - Class in org.apache.beam.sdk.io.solace.write
-
This DoFn is responsible for writing to Solace in batch mode (holding messages to send them in batches), and emits the corresponding output (success or fail; only for persistent messages), so the SolaceIO.Write connector can be composed with other subsequent transforms in the pipeline.
- UnboundedBatchedSolaceWriter(SerializableFunction<Solace.Record, Destination>, SessionServiceFactory, DeliveryMode, SolaceIO.SubmissionMode, int, boolean) - Constructor for class org.apache.beam.sdk.io.solace.write.UnboundedBatchedSolaceWriter
- UnboundedDataset<T> - Class in org.apache.beam.runners.spark.translation.streaming
-
DStream holder. Can also create a DStream from a supplied queue of values, but mainly for testing.
- UnboundedDataset(JavaDStream<WindowedValue<T>>, List<Integer>) - Constructor for class org.apache.beam.runners.spark.translation.streaming.UnboundedDataset
- UnboundedReader() - Constructor for class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
- UnboundedReaderImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- UnboundedReaderMaxReadTimeFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.UnboundedReaderMaxReadTimeFactory
- UnboundedSolaceSource<T> - Class in org.apache.beam.sdk.io.solace.read
- UnboundedSolaceSource(Queue, SempClientFactory, SessionServiceFactory, Integer, boolean, Coder<T>, SerializableFunction<T, Instant>, Duration, SerializableFunction<BytesXMLMessage, T>) - Constructor for class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
- UnboundedSolaceWriter - Class in org.apache.beam.sdk.io.solace.write
-
This DoFn encapsulates common code used both for the UnboundedBatchedSolaceWriter and UnboundedStreamingSolaceWriter.
- UnboundedSolaceWriter(SerializableFunction<Solace.Record, Destination>, SessionServiceFactory, DeliveryMode, SolaceIO.SubmissionMode, int, boolean) - Constructor for class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- UnboundedSource<OutputT, CheckpointMarkT> - Class in org.apache.beam.sdk.io
-
A Source that reads an unbounded amount of input and, because of that, supports some additional operations such as checkpointing, watermarks, and record ids.
- UnboundedSource() - Constructor for class org.apache.beam.sdk.io.UnboundedSource
- UnboundedSource.CheckpointMark - Interface in org.apache.beam.sdk.io
-
A marker representing the progress and state of an UnboundedSource.UnboundedReader.
- UnboundedSource.CheckpointMark.NoopCheckpointMark - Class in org.apache.beam.sdk.io
-
A checkpoint mark that does nothing when finalized.
- UnboundedSource.UnboundedReader<OutputT> - Class in org.apache.beam.sdk.io
-
A Reader that reads an unbounded amount of input.
- UnboundedSourceImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- UnboundedSourceP<T, CmT> - Class in org.apache.beam.runners.jet.processors
-
Jet Processor implementation for reading from an unbounded Beam source.
- UnboundedSourceWrapper<OutputT, CheckpointMarkT> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming.io
-
Wrapper for executing UnboundedSources as a Flink Source.
- UnboundedSourceWrapper(String, PipelineOptions, UnboundedSource<OutputT, CheckpointMarkT>, int) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper
- UnboundedStreamingSolaceWriter - Class in org.apache.beam.sdk.io.solace.write
-
This DoFn is responsible for writing to Solace in streaming mode (one message at a time, without holding up any message), and emits the corresponding output (success or fail; only for persistent messages), so the SolaceIO.Write connector can be composed with other subsequent transforms in the pipeline.
- UnboundedStreamingSolaceWriter(SerializableFunction<Solace.Record, Destination>, SessionServiceFactory, DeliveryMode, SolaceIO.SubmissionMode, int, boolean) - Constructor for class org.apache.beam.sdk.io.solace.write.UnboundedStreamingSolaceWriter
- unboundedWithTimestampFn(SerializableFunction<Long, Instant>) - Static method in class org.apache.beam.sdk.io.CountingSource
-
Deprecated. Use GenerateSequence and call GenerateSequence.withTimestampFn(SerializableFunction) instead.
- unboxedType - Variable in class org.apache.beam.sdk.schemas.utils.ConvertHelpers.ConvertedSchemaInformation
- UNCOMPRESSED - Enum constant in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- UNCOMPRESSED - Enum constant in enum class org.apache.beam.sdk.io.Compression
-
No compression.
- UNCOMPRESSED - Enum constant in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- UNCOMPRESSED - Enum constant in enum class org.apache.beam.sdk.io.TextIO.CompressionType
-
Deprecated.
- UNCOMPRESSED - Enum constant in enum class org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
-
Deprecated.
- union(Iterable<FieldAccessDescriptor>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
- union(Contextful...) - Static method in class org.apache.beam.sdk.transforms.Requirements
- UNION - Enum constant in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.OpType
- unionAll() - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new PTransform that follows SET ALL semantics: it takes a PCollectionList<PCollection<T>> and returns a PCollection<T> containing the unionAll of the collections, done in order for all collections in the PCollectionList<T>.
- unionAll(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new PTransform that follows SET ALL semantics to compute the unionAll with the provided PCollection<T>.
- UnionCoder - Class in org.apache.beam.sdk.transforms.join
-
A UnionCoder encodes RawUnionValues.
- unionDistinct() - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new PTransform that follows SET DISTINCT semantics: it takes a PCollectionList<PCollection<T>> and returns a PCollection<T> containing the union of the collections, done in order for all collections in the PCollectionList<T>.
- unionDistinct(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new PTransform that follows SET DISTINCT semantics to compute the union with the provided PCollection<T>.
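A minimal sketch of the binary forms; the collections `left` and `right` are illustrative:
```java
import org.apache.beam.sdk.transforms.Sets;
import org.apache.beam.sdk.values.PCollection;

// SET DISTINCT semantics: duplicates are collapsed.
PCollection<String> union = left.apply(Sets.unionDistinct(right));

// SET ALL semantics: duplicates are kept.
PCollection<String> unionAll = left.apply(Sets.unionAll(right));
```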
- UniqueIdGenerator - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
-
Generate unique IDs that can be used to differentiate different jobs and partitions.
- UniqueIdGenerator() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.UniqueIdGenerator
- unknown() - Static method in class org.apache.beam.sdk.io.fs.MatchResult
-
Returns a MatchResult with MatchResult.Status.UNKNOWN.
- UNKNOWN - Enum constant in enum class org.apache.beam.sdk.coders.CannotProvideCoderException.ReasonCode
-
The reason a coder could not be provided is unknown or does not have an established CannotProvideCoderException.ReasonCode.
- UNKNOWN - Enum constant in enum class org.apache.beam.sdk.io.fs.MatchResult.Status
- UNKNOWN - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
- UNKNOWN - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
- UNKNOWN - Enum constant in enum class org.apache.beam.sdk.io.solace.data.Solace.DestinationType
- UNKNOWN - Enum constant in enum class org.apache.beam.sdk.PipelineResult.State
-
The job state was not specified or unknown to a runner.
- UNKNOWN - Enum constant in enum class org.apache.beam.sdk.transforms.windowing.PaneInfo.Timing
-
This element was not produced in a triggered pane and its relation to input and output watermarks is unknown.
- UNKNOWN - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
-
Returns an instance with all values set to INFINITY.
- UNKNOWN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
- UnknownLogicalType<T> - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A base class for logical types that are not understood by the Java SDK.
- UnknownLogicalType(String, byte[], Schema.FieldType, Object, Schema.FieldType) - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.UnknownLogicalType
- unparse(SqlWriter, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCheckConstraint
- unparse(SqlWriter, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlColumnDeclaration
- unparse(SqlWriter, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateCatalog
- unparse(SqlWriter, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateDatabase
- unparse(SqlWriter, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateExternalTable
- unparse(SqlWriter, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
- unparse(SqlWriter, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropCatalog
- unparse(SqlWriter, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropDatabase
- unparse(SqlWriter, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropTable
- unparseCall(SqlWriter, SqlCall, int, int) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- unparseDateTimeLiteral(SqlWriter, SqlAbstractDateTimeLiteral, int, int) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
- unparseSqlIntervalLiteral(SqlWriter, SqlIntervalLiteral, int, int) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
-
BigQuery interval syntax: INTERVAL int64 time_unit.
- unpersist() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
- unpin() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
-
Unpin this object.
- UnprocessedEvent<EventT> - Class in org.apache.beam.sdk.extensions.ordered
-
Combines the source event which failed to process with the failure reason.
- UnprocessedEvent() - Constructor for class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
- UnprocessedEvent.Reason - Enum Class in org.apache.beam.sdk.extensions.ordered
- unprocessedEvents() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
- UNRECOGNIZED - Enum constant in enum class org.apache.beam.sdk.PipelineResult.State
-
The job state reported by a runner cannot be interpreted by the SDK.
- unregisterConsumer(String) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
-
Unregisters a previously registered consumer.
- unregisterReceiver(String) - Method in interface org.apache.beam.runners.fnexecution.data.FnDataService
-
Receivers are only expected to be unregistered when bundle processing has completed successfully.
- unregisterReceiver(String) - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
- UNSAFELY_ATTEMPT_TO_PROCESS_UNBOUNDED_DATA_IN_BATCH_MODE - Static variable in class org.apache.beam.runners.dataflow.DataflowRunner
-
Experiment to "unsafely attempt to process unbounded data in batch mode".
- UNSIGNED_LEXICOGRAPHICAL_COMPARATOR - Static variable in class org.apache.beam.runners.dataflow.util.RandomAccessData
- UnsignedOptions - Class in org.apache.beam.sdk.extensions.sbe
-
Options for controlling what to do with unsigned types, specifically whether to use a higher bit count or, in the case of uint64, a string.
- UnsignedOptions() - Constructor for class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
- UnsignedOptions.Behavior - Enum Class in org.apache.beam.sdk.extensions.sbe
-
Defines the exact behavior for unsigned types.
- UnsignedOptions.Builder - Class in org.apache.beam.sdk.extensions.sbe
-
Builder for UnsignedOptions.
- UNSPECIFIED - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.FlexResourceSchedulingGoal
-
No goal specified.
- unsupported() - Static method in interface org.apache.beam.runners.fnexecution.control.BundleSplitHandler
-
Returns a bundle split handler that throws on any split response.
- unsupported() - Static method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandler
- unsupported() - Static method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandlerFactory
-
Throws an UnsupportedOperationException on the first access.
- unsupported() - Static method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.SideInputHandlerFactory
-
Throws an UnsupportedOperationException on the first access.
- UnusedFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.UnusedFn
- UnversionedTypeSerializerSnapshot<T> - Class in org.apache.beam.runners.flink.translation.types
-
A legacy snapshot which does not care about schema compatibility.
- UnversionedTypeSerializerSnapshot() - Constructor for class org.apache.beam.runners.flink.translation.types.UnversionedTypeSerializerSnapshot
-
Needs to be public to work with VersionedIOReadableWritable.
- UnversionedTypeSerializerSnapshot(CoderTypeSerializer<T>) - Constructor for class org.apache.beam.runners.flink.translation.types.UnversionedTypeSerializerSnapshot
- unwindowedFilename(int, int, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
- unwindowedFilename(int, int, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
-
When a sink has not requested windowed or triggered output, this method will be invoked to return the file resource to be created given the base output directory and a FileBasedSink.OutputFileHints containing information about the file, including a suggested (e.g.
- unwrap(Class<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
- update(double) - Method in class org.apache.beam.sdk.io.kafka.KafkaIOUtils.MovingAvg
- update(double) - Method in class org.apache.beam.sdk.metrics.DelegatingHistogram
- update(double) - Method in interface org.apache.beam.sdk.metrics.Histogram
-
Add an observation to this histogram.
- update(double) - Method in class org.apache.beam.sdk.metrics.NoOpHistogram
- update(double...) - Method in class org.apache.beam.sdk.metrics.DelegatingHistogram
- update(double...) - Method in interface org.apache.beam.sdk.metrics.Histogram
-
Add observations to this histogram.
- update(double...) - Method in class org.apache.beam.sdk.metrics.NoOpHistogram
- update(long) - Method in class org.apache.beam.runners.jet.metrics.DistributionImpl
- update(long) - Method in class org.apache.beam.sdk.metrics.DelegatingDistribution
- update(long) - Method in interface org.apache.beam.sdk.metrics.Distribution
-
Add an observation to this distribution.
- update(long, long, long, long) - Method in class org.apache.beam.runners.jet.metrics.DistributionImpl
- update(long, long, long, long) - Method in class org.apache.beam.sdk.metrics.DelegatingDistribution
- update(long, long, long, long) - Method in interface org.apache.beam.sdk.metrics.Distribution
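For example, a minimal sketch of recording observations to a Distribution from inside a DoFn (the class and metric names here are illustrative):

    import org.apache.beam.sdk.metrics.Distribution;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    // Records the length of each element as an observation in a Distribution.
    class RecordSizeFn extends DoFn<String, String> {
      // Declared once; the runner aggregates min/max/sum/count across workers.
      private final Distribution elementSizes =
          Metrics.distribution(RecordSizeFn.class, "elementSizes");

      @ProcessElement
      public void processElement(@Element String element, OutputReceiver<String> out) {
        elementSizes.update(element.length()); // one observation, as in update(long)
        out.output(element);
      }
    }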
- update(Timestamp, T) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
-
Updates the estimator with the bytes of records.
- update(Timestamp, T) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.NullThroughputEstimator
-
NoOp.
- update(Timestamp, T) - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.ThroughputEstimator
-
Updates the estimator with the size of the records.
- update(UserCodeExecutionException) - Method in interface org.apache.beam.io.requestresponse.CallShouldBackoff
-
Update the state of whether to back off using information about the exception.
- update(KinesisRecord) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicy
- update(KinesisRecord) - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.ArrivalTimeWatermarkPolicy
- update(KinesisRecord) - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.CustomWatermarkPolicy
- update(KinesisRecord) - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy
- update(HistogramData) - Method in interface org.apache.beam.sdk.metrics.Histogram
-
Add a histogram to this histogram.
- update(Instant, T) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
-
Updates the estimator with the bytes of records if it is selected to be sampled.
- update(ResponseT) - Method in interface org.apache.beam.io.requestresponse.CallShouldBackoff
-
Update the state of whether to back off using information about the response.
- UPDATE - Enum constant in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
- UPDATE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- updateBacklogBytes(String, int, long) - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
-
Tracks backlog bytes to be added to the Metric Container at a later time.
- updateBacklogBytes(String, int, long) - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.NoOpKafkaMetrics
- updateBacklogBytes(String, int, long) - Method in interface org.apache.beam.sdk.io.kafka.KafkaMetrics
- UpdateBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.UpdateBuilder
- updateCacheCandidates(Pipeline, SparkPipelineTranslator, EvaluationContext) - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Evaluator that updates/populates the cache candidates.
- updateCompatibilityVersionLessThan(PipelineOptions, String) - Static method in interface org.apache.beam.sdk.options.StreamingOptions
- UpdateConfiguration - Class in org.apache.beam.sdk.io.mongodb
-
Builds a MongoDB UpdateConfiguration object.
- UpdateConfiguration() - Constructor for class org.apache.beam.sdk.io.mongodb.UpdateConfiguration
- updateConsumerProperties(Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
- updateConsumerProperties(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Deprecated as of version 2.13. Use KafkaIO.Read.withConsumerConfigUpdates(Map) instead.
- UPDATED - Enum constant in enum class org.apache.beam.sdk.PipelineResult.State
-
The job has been updated.
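As a hedged sketch of the replacement API named in the KafkaIO.Read deprecation note above (broker address, topic, and config values are illustrative):

    import java.util.Map;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.common.serialization.LongDeserializer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    // Consumer config overrides are passed while building the read,
    // instead of mutating properties afterwards.
    KafkaIO.Read<Long, String> read =
        KafkaIO.<Long, String>read()
            .withBootstrapServers("broker:9092")
            .withTopic("my-topic")
            .withKeyDeserializer(LongDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withConsumerConfigUpdates(
                Map.<String, Object>of(
                    "group.id", "my-group",
                    "auto.offset.reset", "earliest"));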
- updateDataClientSettings(BigtableDataSettings.Builder) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableClientOverride
-
Update BigtableDataSettings.Builder with custom configurations.
- updateDataRecordCommittedToEmitted(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
- updateDependentTransforms(Pipeline, SparkPipelineTranslator, EvaluationContext) - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Evaluator that updates/populates information about dependent transforms for PCollections.
- updateDetectNewPartitionWatermark(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Update the watermark cell for Detect New Partition step.
- updateFailedRpcMetrics(Instant, Instant, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.NoOpStreamingInsertsMetrics
- updateFailedRpcMetrics(Instant, Instant, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.StreamingInsertsMetricsImpl
-
Record the RPC status and latency of a failed StreamingInserts RPC call.
- updateFailedRpcMetrics(Instant, Instant, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics
- UpdateField - Class in org.apache.beam.sdk.io.mongodb
- UpdateField() - Constructor for class org.apache.beam.sdk.io.mongodb.UpdateField
- updateInstanceAdminClientSettings(BigtableInstanceAdminSettings.Builder) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableClientOverride
-
Update BigtableInstanceAdminSettings.Builder with custom configurations.
- updateJob(String, Job) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
Updates the Dataflow Job with the given jobId.
- updateMetrics(String, List<MetricsApi.MonitoringInfo>) - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
-
Update this container with metrics from the passed MetricsApi.MonitoringInfos, and send updates along to Flink's internal metrics framework.
- updateMetrics(String, List<MetricsApi.MonitoringInfo>) - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainerWithoutAccumulator
-
Update this container with metrics from the passed MetricsApi.MonitoringInfos, and send updates along to Flink's internal metrics framework.
- updatePartitionCreatedToScheduled(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Adds measurement of an instance for the ChangeStreamMetrics.PARTITION_CREATED_TO_SCHEDULED_MS if the metric is enabled.
- updatePartitionScheduledToRunning(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Adds measurement of an instance for the ChangeStreamMetrics.PARTITION_SCHEDULED_TO_RUNNING_MS if the metric is enabled.
- updateProcessingDelayFromCommitTimestamp(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
-
Adds measurement of an instance for the ChangeStreamMetrics.PROCESSING_DELAY_FROM_COMMIT_TIMESTAMP.
- updateProducerIndex() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
- updateProducerProperties(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Deprecated as of version 2.13. Use KafkaIO.Write.withProducerConfigUpdates(Map) instead.
- updateProducerProperties(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Deprecated as of version 2.13. Use KafkaIO.WriteRecords.withProducerConfigUpdates(Map) instead.
- updateRetriedRowsWithStatus(String, int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.NoOpStreamingInsertsMetrics
- updateRetriedRowsWithStatus(String, int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.StreamingInsertsMetricsImpl
-
Update metrics for rows that were retried due to an RPC error.
- updateRetriedRowsWithStatus(String, int) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics
- UpdateSchemaDestination<DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Update destination schema based on data that is about to be copied into it.
- UpdateSchemaDestination(BigQueryServices, PCollectionView<String>, ValueProvider<String>, BigQueryIO.Write.WriteDisposition, BigQueryIO.Write.CreateDisposition, int, String, Set<BigQueryIO.Write.SchemaUpdateOption>, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
- updateSerializedOptions(String, Map<String, String>) - Static method in class org.apache.beam.sdk.options.ValueProviders
-
Deprecated. Use TestPipeline.newProvider(T) for testing ValueProvider code.
- Updates to the connector code - Search tag in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
- Section
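A minimal sketch of the TestPipeline.newProvider(T) replacement suggested in the updateSerializedOptions deprecation note above (the provided value is illustrative):

    import org.apache.beam.sdk.options.ValueProvider;
    import org.apache.beam.sdk.testing.TestPipeline;

    TestPipeline p = TestPipeline.create();
    // Yields a ValueProvider whose value is only accessible at runtime,
    // mirroring how templated options behave in production.
    ValueProvider<String> tableName = p.newProvider("my_table");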
- Updates to the I/O connector code - Search tag in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
- Section
- Updates to the I/O connector code - Search tag in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
- Section
- Updates to the I/O connector code - Search tag in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
- Section
- Updates to the I/O connector code - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
- Section
- Updates to the I/O connector code - Search tag in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
- Section
- Updates to the I/O connector code - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- Updates to the I/O connector code - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- updateStreamingInsertsMetrics(TableReference, int, int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.NoOpStreamingInsertsMetrics
- updateStreamingInsertsMetrics(TableReference, int, int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.StreamingInsertsMetricsImpl
-
Export all metrics recorded in this instance to the underlying perWorkerMetrics containers.
- updateStreamingInsertsMetrics(TableReference, int, int) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics
- updateSuccessfulRpcMetrics(String, Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
-
Record the RPC status and latency of a successful Kafka poll RPC call.
- updateSuccessfulRpcMetrics(String, Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.NoOpKafkaMetrics
- updateSuccessfulRpcMetrics(String, Duration) - Method in interface org.apache.beam.sdk.io.kafka.KafkaMetrics
- updateSuccessfulRpcMetrics(Instant, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.NoOpStreamingInsertsMetrics
- updateSuccessfulRpcMetrics(Instant, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.StreamingInsertsMetricsImpl
-
Record the RPC status and latency of a successful StreamingInserts RPC call.
- updateSuccessfulRpcMetrics(Instant, Instant) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics
- updateTableAdminClientSettings(BigtableTableAdminSettings.Builder) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableClientOverride
-
Update BigtableTableAdminSettings.Builder with custom configurations.
- updateTableSchema(TableReference, TableSchema) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- updateTableSchema(String, String, String, TableSchema) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- updateToFinished(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Updates a partition row to PartitionMetadata.State.FINISHED state.
- updateToFinished(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Updates a partition row to PartitionMetadata.State.FINISHED state.
- updateToRunning(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Updates a partition row to PartitionMetadata.State.RUNNING state.
- updateToRunning(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Updates a partition row to PartitionMetadata.State.RUNNING state.
- updateToScheduled(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Updates multiple partition rows to PartitionMetadata.State.SCHEDULED state.
- updateToScheduled(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Updates multiple partition rows to PartitionMetadata.State.SCHEDULED state.
- updateWatermark(Range.ByteStringRange, Instant, ChangeStreamContinuationToken) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Update the metadata for the row key represented by the partition.
- updateWatermark(String, Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Update the partition watermark to the given timestamp.
- updateWatermark(String, Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Update the partition watermark to the given timestamp.
- updateWindowingStrategy(WindowingStrategy<?, ?>) - Method in class org.apache.beam.runners.dataflow.internal.DataflowGroupByKey
- updateWindowingStrategy(WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.GroupByKey
- UploadIdResponseInterceptor - Class in org.apache.beam.sdk.extensions.gcp.util
-
Implements a response interceptor that logs the upload id if the upload id header exists and it is the first request (does not have an upload_id parameter in the request).
- UploadIdResponseInterceptor() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.UploadIdResponseInterceptor
- uploadToDicomStore(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Upload to a Dicom Store.
- uploadToDicomStore(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- UPSERT - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation.MutationType
- Upserts and deletes - Search tag in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
- Section
- upTo(long) - Static method in class org.apache.beam.sdk.io.CountingSource
-
Deprecated. Use GenerateSequence instead.
- URL_ARTIFACT_URN - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
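For the CountingSource.upTo deprecation above, a minimal sketch of the GenerateSequence equivalent (the pipeline variable is assumed):

    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.values.PCollection;

    // Produces the bounded range [0, 1000), as upTo(1000) once did.
    PCollection<Long> numbers = pipeline.apply(GenerateSequence.from(0).to(1000));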
- URN - Static variable in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
- URN - Static variable in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
- URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead
- URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite
- URN - Static variable in class org.apache.beam.sdk.io.GenerateSequence.External
- URN - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External
- URN_WITH_METADATA - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External
- URN_WITHOUT_METADATA - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External
- Usage - Search tag in org.apache.beam.sdk.io.BoundedSource.BoundedReader.getCurrentSource()
- Section
- Usage example - Search tag in class org.apache.beam.io.debezium.DebeziumIO
- Section
- Usage example - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- Usage Example - Search tag in class org.apache.beam.io.debezium.SourceRecordJson
- Section
- Usage of the RangeTracker class hierarchy - Search tag in interface org.apache.beam.sdk.io.range.RangeTracker
- Section
- Usage with different models of iteration - Search tag in interface org.apache.beam.sdk.io.range.RangeTracker
- Section
- Usage with templates - Search tag in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
- Section
- USE_INDEXED_FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- useAbstractConvertersForConversion(RelTraitSet, RelTraitSet) - Method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
- useAvroLogicalTypes() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- useAvroLogicalTypes() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Enables interpreting logical types into their corresponding types (ie.
- Use Avro schema with Confluent Schema Registry - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- useBeamSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If true, then the BigQuery schema will be inferred from the input schema.
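A hedged sketch of useBeamSchema() on a write (the input PCollection, element type, and table name are illustrative; the elements are assumed to carry a Beam schema):

    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;

    // With useBeamSchema(), the destination table schema is inferred from the
    // element schema, so no explicit withSchema(...) call is needed.
    rows.apply(
        BigQueryIO.<MyRow>write()
            .to("project:dataset.table")
            .useBeamSchema()
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));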
- useCatalog(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.CatalogManager
-
Switches the active catalog.
- useCatalog(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.EmptyCatalogManager
- useCatalog(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalogManager
- useDatabase(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.catalog.Catalog
-
Switches to use the specified database.
- useDatabase(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.catalog.InMemoryCatalog
- USER_DEFINED_JAVA_AGGREGATE_FUNCTIONS - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCatalog
- USER_DEFINED_JAVA_SCALAR_FUNCTIONS - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCatalog
- USER_DEFINED_SQL_FUNCTIONS - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCatalog
- USER_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- USER_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- UserAgentFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.UserAgentFactory
- UserCodeExecutionException - Exception Class in org.apache.beam.io.requestresponse
-
Base Exception for signaling errors in user custom code.
- UserCodeExecutionException(String) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeExecutionException
- UserCodeExecutionException(String, Throwable) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeExecutionException
- UserCodeExecutionException(String, Throwable, boolean, boolean) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeExecutionException
- UserCodeExecutionException(Throwable) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeExecutionException
- UserCodeQuotaException - Exception Class in org.apache.beam.io.requestresponse
-
Extends UserCodeExecutionException to allow the user custom code to specifically signal a Quota or API overuse related error.
- UserCodeQuotaException(String) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeQuotaException
- UserCodeQuotaException(String, Throwable) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeQuotaException
- UserCodeQuotaException(String, Throwable, boolean, boolean) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeQuotaException
- UserCodeQuotaException(Throwable) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeQuotaException
- UserCodeRemoteSystemException - Exception Class in org.apache.beam.io.requestresponse
-
A UserCodeExecutionException that signals an error with a remote system.
- UserCodeRemoteSystemException(String) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeRemoteSystemException
- UserCodeRemoteSystemException(String, Throwable) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeRemoteSystemException
- UserCodeRemoteSystemException(String, Throwable, boolean, boolean) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeRemoteSystemException
- UserCodeRemoteSystemException(Throwable) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeRemoteSystemException
- UserCodeTimeoutException - Exception Class in org.apache.beam.io.requestresponse
-
An extension of UserCodeQuotaException to specifically signal a user code timeout.
- UserCodeTimeoutException(String) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeTimeoutException
- UserCodeTimeoutException(String, Throwable) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeTimeoutException
- UserCodeTimeoutException(String, Throwable, boolean, boolean) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeTimeoutException
- UserCodeTimeoutException(Throwable) - Constructor for exception class org.apache.beam.io.requestresponse.UserCodeTimeoutException
- userDefinedAggregateFunctions() - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider
- userDefinedAggregateFunctions() - Method in interface org.apache.beam.sdk.extensions.sql.udf.UdfProvider
-
Maps function names to aggregate function implementations.
- userDefinedScalarFunctions() - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider
- userDefinedScalarFunctions() - Method in interface org.apache.beam.sdk.extensions.sql.udf.UdfProvider
-
Maps function names to scalar function implementations.
- useReflectApi() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
-
Deprecated. Kept for backward API compatibility only.
- UserFunctionDefinitions - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
-
Holds user defined function definitions.
- UserFunctionDefinitions() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions
- UserFunctionDefinitions.Builder - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
- UserFunctionDefinitions.JavaScalarFunction - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
- username() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
-
The username to use for authentication.
- username() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
- username(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory.Builder
-
Set Solace username.
- username(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory.Builder
-
Set Solace username.
- username(String) - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
-
Username to be used to authenticate with the broker.
- userStateId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
- USES_KEYED_STATE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- UsesAttemptedMetrics - Class in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize Metrics.
- UsesAttemptedMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesAttemptedMetrics
- UsesBoundedSplittableParDo - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize splittable ParDo with a DoFn.BoundedPerElement DoFn.
- UsesBoundedTrieMetrics - Class in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize BoundedTrie.
- UsesBoundedTrieMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesBoundedTrieMetrics
- UsesBundleFinalizer - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which use DoFn.BundleFinalizer.
- UsesCommittedMetrics - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize Metrics.
- UsesCounterMetrics - Class in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize Counter.
- UsesCounterMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesCounterMetrics
- UsesCustomWindowMerging - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize custom window merging.
- UsesDistributionMetrics - Class in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize Distribution.
- UsesDistributionMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesDistributionMetrics
- usesErrorHandler() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- UsesExternalService - Interface in org.apache.beam.sdk.testing
-
Category tag for tests which rely on a pre-defined port, such as an expansion service or transform service.
- UsesFailureMessage - Interface in org.apache.beam.sdk.testing
-
Category tag for tests which validate that the correct failure message is provided by a failed pipeline.
- UsesGaugeMetrics - Class in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize Gauge.
- UsesGaugeMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesGaugeMetrics
- UsesImpulse - Class in org.apache.beam.sdk.testing
-
Category for tests that use Impulse transformations.
- UsesImpulse() - Constructor for class org.apache.beam.sdk.testing.UsesImpulse
- UsesJavaExpansionService - Interface in org.apache.beam.sdk.testing
-
Category tag for tests which use the expansion service in Java.
- UsesKeyInParDo - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which use key.
- UsesKms - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize --tempRoot from TestPipelineOptions and expect a default KMS key enabled for the bucket specified.
- UsesLoopingTimer - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize looping timers in ParDo.
- UsesMapState - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize MapState.
- UsesMetricsPusher - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize the metrics pusher feature.
- UsesMultimapState - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize MultimapState.
- UsesOnWindowExpiration - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize DoFn.OnWindowExpiration.
- UsesOrderedListState - Class in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize OrderedListState.
- UsesOrderedListState() - Constructor for class org.apache.beam.sdk.testing.UsesOrderedListState
- UsesParDoLifecycle - Interface in org.apache.beam.sdk.testing
-
Category tag for the ParDoLifecycleTest for exclusion (BEAM-3241).
- UsesPerKeyOrderedDelivery - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which rely on a runner providing per-key ordering.
- UsesPerKeyOrderInBundle - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which rely on a runner providing per-key ordering between transforms in the same ProcessBundleRequest.
- UsesProcessingTimeTimers - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize timers in ParDo.
- UsesPythonExpansionService - Interface in org.apache.beam.sdk.testing
-
Category tag for tests which use the expansion service in Python.
- UsesRequiresTimeSortedInput - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize DoFn.RequiresTimeSortedInput in stateful ParDo.
- usesReshuffle - Variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
- UsesSchema - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize schemas.
- UsesSdkHarnessEnvironment - Interface in org.apache.beam.sdk.testing
-
Category tag for tests which validate that the SDK harness executes in a well formed environment.
- UsesSetState - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize SetState.
- UsesSideInputs - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which use side inputs.
- UsesSideInputsWithDifferentCoders - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which use multiple side inputs with different coders.
- UsesStatefulParDo - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize stateful ParDo.
- UsesStrictTimerOrdering - Interface in org.apache.beam.sdk.testing
-
Category for tests that enforce strict event-time ordering of fired timers, even in situations where multiple timers mutually set one another and the watermark hops arbitrarily far into the future.
- UsesStringSetMetrics - Class in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize StringSet.
- UsesStringSetMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesStringSetMetrics
- UsesSystemMetrics - Interface in org.apache.beam.sdk.testing
-
Category tag for tests that use System metrics.
- UsesTestStream - Interface in org.apache.beam.sdk.testing
-
Category tag for tests that use TestStream, which is not a part of the Beam model but a special feature currently only implemented by the direct runner and the Flink Runner (streaming).
- UsesTestStreamWithMultipleStages - Interface in org.apache.beam.sdk.testing
-
Subcategory for UsesTestStream tests which use TestStream across multiple stages.
- UsesTestStreamWithOutputTimestamp - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which use outputTimestamp.
- UsesTestStreamWithProcessingTime - Interface in org.apache.beam.sdk.testing
-
Subcategory for UsesTestStream tests which use the processing time feature of TestStream.
- UsesTimerMap - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which use timerMap.
- UsesTimersInParDo - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize timers in ParDo.
- useStreamingSideInput() - Method in class org.apache.beam.runners.spark.translation.EvaluationContext
-
Marks that the pipeline contains at least one streaming side input.
- UsesTriggeredSideInputs - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which use triggered side inputs.
- UsesUnboundedPCollections - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize at least one unbounded PCollection.
- UsesUnboundedSplittableParDo - Interface in org.apache.beam.sdk.testing
-
Category tag for validation tests which utilize splittable ParDo with a DoFn.UnboundedPerElement DoFn.
- using(Integer...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.Impl
-
Perform a natural join between the PCollections.
- using(String...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.Impl
-
Perform a natural join between the PCollections.
- using(FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.transforms.Join.Impl
-
Perform a natural join between the PCollections.
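A minimal sketch of a natural join on a shared field (the PCollections, element types, and field name are illustrative; both inputs must have Beam schemas):

    import org.apache.beam.sdk.schemas.transforms.Join;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Joins users and orders on equality of their "userId" fields;
    // each output Row pairs the matching lhs and rhs values.
    PCollection<Row> joined =
        users.apply(Join.<User, Order>innerJoin(orders).using("userId"));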
- usingFnApiClient(InstructionRequestHandler, FnDataService) - Static method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
-
Creates a client for a particular SDK harness.
- usingHigherBitSize(UnsignedOptions.Behavior) - Static method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
-
Returns options for using a higher bit count for unsigned types.
- Using local emulator - Search tag in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
- Section
- usingRedis(URI, Coder<RequestT>, Coder<ResponseT>, Duration) - Static method in class org.apache.beam.io.requestresponse.Cache
- usingSameBitSize() - Static method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
-
Returns options for using the same bit size for all unsigned types.
- usingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Enables BigQuery's Standard SQL dialect when reading from a query.
- usingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
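For example, a minimal sketch of reading query results with the Standard SQL dialect (the query and pipeline variable are illustrative):

    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;

    // Without usingStandardSql(), the query is interpreted as legacy SQL.
    pipeline.apply(
        BigQueryIO.readTableRows()
            .fromQuery("SELECT name FROM `my-project.my_dataset.my_table`")
            .usingStandardSql());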
- Using the ApproximateDistinctFn CombineFn - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- Using the Transforms - Search tag in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
- Section
- UTCDateOnly() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCDateOnly
- UTCTimeOnly() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
- UTCTimestamp() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
- Utils - Class in org.apache.beam.runners.jet
-
Various common methods used by the Jet based runner.
- Utils() - Constructor for class org.apache.beam.runners.jet.Utils
- Utils() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
- Utils.ByteArrayKey - Class in org.apache.beam.runners.jet
-
A wrapper of byte[] that can be used as a hash-map key.
- Uuid - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
A Uuid storable in a Pub/Sub Lite attribute.
- Uuid() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
- UUID_SCHEMA - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
- UuidCoder - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
A coder for a Uuid.
- UuidCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
- UuidDeduplicationOptions - Class in org.apache.beam.sdk.io.gcp.pubsublite
-
Options for deduplicating Pub/Sub Lite messages based on the UUID they were published with.
- UuidDeduplicationOptions() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- UuidDeduplicationOptions.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
- UuidDeduplicationTransform - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
A transform for deduplicating Pub/Sub Lite messages based on the UUID they were published with.
- UuidDeduplicationTransform(UuidDeduplicationOptions) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidDeduplicationTransform
- uuidExtractor() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- UuidLogicalType - Class in org.apache.beam.sdk.schemas.logicaltypes
-
Base class for types representing UUID as two long values.
- UuidLogicalType() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
V
- v1() - Static method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreIO
-
Returns a DatastoreV1 that provides an API for accessing Cloud Datastore through the v1 version of the Datastore client library.
- v1() - Static method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreIO
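A hedged sketch of reading through the v1 client (the project id and query are illustrative; a real query would normally set a kind filter):

    import com.google.datastore.v1.Query;
    import org.apache.beam.sdk.io.gcp.datastore.DatastoreIO;

    Query query = Query.newBuilder().build(); // illustrative; usually built with a kind
    pipeline.apply(
        DatastoreIO.v1().read()
            .withProjectId("my-project")
            .withQuery(query));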
- V1_READ_OVERRIDE - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscribeTransform
- VALID_DATA_FORMATS - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- VALID_DATA_FORMATS - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
- VALID_DATA_FORMATS - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- VALID_DATA_FORMATS - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- VALID_FIELD_TYPE_SET - Static variable in class org.apache.beam.sdk.io.csv.CsvIO
-
The valid Schema.FieldType from which CsvIO converts CSV records to the fields.
- VALID_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
- VALID_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
- VALID_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
- VALID_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- VALID_PROVIDERS - Static variable in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
- VALID_START_OFFSET_VALUES - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- validate() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
- validate() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
- validate() - Method in class org.apache.beam.sdk.io.CompressedSource
-
Validates that the delegate source is a valid source and that the channel factory is not null.
- validate() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration
- validate() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
- validate() - Method in class org.apache.beam.sdk.io.FileBasedSource
- validate() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- validate() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
- validate() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
-
Instantiates a BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder instance.
- validate() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
-
Validates the configuration object.
- validate() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
- validate() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
- validate() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- validate() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
- validate() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- validate() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
- validate() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
- validate() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- validate() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration
- validate() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- validate() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
- validate() - Method in class org.apache.beam.sdk.io.Source
-
Checks that this source is valid, before it can be used in a pipeline.
- validate() - Method in class org.apache.beam.sdk.io.TFRecordReadSchemaTransformConfiguration
- validate() - Method in class org.apache.beam.sdk.io.TFRecordWriteSchemaTransformConfiguration
- validate() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
- validate() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
- validate(Class<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.options.PipelineOptionsValidator
-
Validates that the passed PipelineOptions conforms to all the validation criteria from the passed in interface.
- validate(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
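A minimal sketch of validating parsed options against an options interface (MyOptions and args are hypothetical):

    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.options.PipelineOptionsValidator;

    // Throws if a @Validation.Required getter on MyOptions was left unset.
    MyOptions options = PipelineOptionsFactory.fromArgs(args).as(MyOptions.class);
    MyOptions validated = PipelineOptionsValidator.validate(MyOptions.class, options);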
- validate(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
- validate(Schema) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicros
- validate(Schema) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillis
- validate(AwsOptions, ClientConfiguration) - Static method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
Utility to validate if all necessary configuration is available to create clients using the ClientBuilderFactory configured in AwsOptions.
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.transforms.PTransform
-
Called before running the Pipeline to verify this transform is fully and correctly specified.
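For example, a hedged sketch of overriding validate in a custom composite transform (the transform and its setting are illustrative):

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.transforms.PTransform;
    import org.apache.beam.sdk.values.PCollection;

    // Fails fast at pipeline construction if the transform is misconfigured.
    class WriteToTable extends PTransform<PCollection<String>, PCollection<String>> {
      private final String table;

      WriteToTable(String table) { this.table = table; }

      @Override
      public void validate(PipelineOptions options) {
        if (table == null || table.isEmpty()) {
          throw new IllegalArgumentException("WriteToTable requires a table name");
        }
      }

      @Override
      public PCollection<String> expand(PCollection<String> input) {
        return input; // real write logic elided
      }
    }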
- validate(PipelineOptions, Map<TupleTag<?>, PCollection<?>>, Map<TupleTag<?>, PCollection<?>>) - Method in class org.apache.beam.sdk.transforms.GroupByKey
- validate(PipelineOptions, Map<TupleTag<?>, PCollection<?>>, Map<TupleTag<?>, PCollection<?>>) - Method in class org.apache.beam.sdk.transforms.PTransform
-
Called before running the Pipeline to verify this transform, its inputs, and outputs are fully and correctly specified.
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSink
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Write
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.WriteFiles
- validate(T) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
- VALIDATE_TIME_INTERVAL - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- VALIDATE_TIMESTAMP - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- validateCli(Class<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.options.PipelineOptionsValidator
-
Validates that the passed PipelineOptions from command line interface (CLI) conforms to all the validation criteria from the passed in interface.
- validateCoderIsCompatible(IsmFormat.IsmRecordCoder<?>) - Static method in class org.apache.beam.runners.dataflow.internal.IsmFormat
-
Validates that the key portion of the given coder is deterministic.
- validateGetOutputTimestamps(WindowFn<T, W>, TimestampCombiner, List<List<Long>>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
-
Verifies that later-ending merged windows from any of the timestamps hold up output of earlier-ending windows, using the provided WindowFn and TimestampCombiner.
- validateGetOutputTimestampsWithValue(WindowFn<T, W>, TimestampCombiner, List<List<TimestampedValue<T>>>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
-
Verifies that later-ending merged windows from any of the timestampValues hold up output of earlier-ending windows, using the provided WindowFn and TimestampCombiner.
- validateInputFilePatternSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
-
Validates the input GCS path is accessible and that the path is well formed.
- validateInputFilePatternSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
- validateInputFilePatternSupported(String) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
-
Validate that a file pattern is conforming.
- validateJavaBean(List<FieldValueTypeInformation>, List<FieldValueTypeInformation>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
- validateMaterializations(Iterable<PCollectionView<?>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
- validateMethod(Method) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
- validateOutputFilePrefixSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
-
Validates the output GCS path is accessible and that the path is well formed.
- validateOutputFilePrefixSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
- validateOutputFilePrefixSupported(String) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
-
Validate that an output file prefix is conforming.
- validateOutputResourceSupported(ResourceId) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
- validateOutputResourceSupported(ResourceId) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
- validateOutputResourceSupported(ResourceId) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
-
Validates that an output path is conforming.
- ValidatesRunner - Interface in org.apache.beam.sdk.testing
-
Category tag for tests which validate that a Beam runner is correctly implemented.
- validateTimeInterval(Long, TimeUnit) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
-
This function validates that the interval is compatible with the ZetaSQL timestamp value range.
- validateTimestamp(Long) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
-
This function validates that the Long representation of a timestamp is compatible with the ZetaSQL timestamp value range.
- validateTimestampBounds(Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
-
Validates that a given timestamp is within min and max bounds.
- validateTransform() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
Validates construction of this transform.
- validateTransform() - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
- Validation - Annotation Interface in org.apache.beam.sdk.options
-
Validation represents a set of annotations that can be used to annotate getter properties on PipelineOptions with information representing the validation criteria to be used when validating with the PipelineOptionsValidator.
- Validation.Required - Annotation Interface in org.apache.beam.sdk.options
-
This criterion specifies that the value must not be null.
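A minimal sketch tying Validation.Required together with the Default and Description annotations indexed nearby (the interface and option names are illustrative):

    import org.apache.beam.sdk.options.Default;
    import org.apache.beam.sdk.options.Description;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.Validation;

    public interface MyOptions extends PipelineOptions {
      @Description("Path of the file to read from")
      @Validation.Required // PipelineOptionsValidator rejects a null value
      String getInputFile();
      void setInputFile(String value);

      @Description("Number of output shards")
      @Default.Integer(10)
      int getNumShards();
      void setNumShards(int value);
    }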
- validator() - Method in class org.apache.beam.sdk.schemas.transforms.Cast
- Valid CSVFormat Configuration - Search tag in class org.apache.beam.sdk.io.csv.CsvIO
- Section
- value() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItem
- value() - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
- value() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
- value() - Element in annotation interface org.apache.beam.sdk.coders.DefaultCoder
- value() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
- value() - Element in annotation interface org.apache.beam.sdk.options.Default.Boolean
- value() - Element in annotation interface org.apache.beam.sdk.options.Default.Byte
- value() - Element in annotation interface org.apache.beam.sdk.options.Default.Character
- value() - Element in annotation interface org.apache.beam.sdk.options.Default.Class
- value() - Element in annotation interface org.apache.beam.sdk.options.Default.Double
- value() - Element in annotation interface org.apache.beam.sdk.options.Default.Enum
- value() - Element in annotation interface org.apache.beam.sdk.options.Default.Float
- value() - Element in annotation interface org.apache.beam.sdk.options.Default.InstanceFactory
- value() - Element in annotation interface org.apache.beam.sdk.options.Default.Integer
- value() - Element in annotation interface org.apache.beam.sdk.options.Default.Long
- value() - Element in annotation interface org.apache.beam.sdk.options.Default.Short
- value() - Element in annotation interface org.apache.beam.sdk.options.Default.String
- value() - Element in annotation interface org.apache.beam.sdk.options.Description
- value() - Element in annotation interface org.apache.beam.sdk.schemas.annotations.DefaultSchema
-
The schema provider implementation that knows how to vend schemas for the annotated class.
- value() - Element in annotation interface org.apache.beam.sdk.schemas.annotations.SchemaCaseFormat
-
The name to use for the generated schema field.
- value() - Element in annotation interface org.apache.beam.sdk.schemas.annotations.SchemaFieldDescription
-
The description to use for the generated schema field.
- value() - Element in annotation interface org.apache.beam.sdk.schemas.annotations.SchemaFieldName
-
The name to use for the generated schema field.
- value() - Element in annotation interface org.apache.beam.sdk.schemas.annotations.SchemaFieldNumber
-
The number to use for the generated schema field.
- value() - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Create a StateSpec for a single value of type T.
- value() - Element in annotation interface org.apache.beam.sdk.transforms.DoFn.FieldAccess
- value() - Element in annotation interface org.apache.beam.sdk.transforms.DoFn.OnTimer
-
The timer ID.
- value() - Element in annotation interface org.apache.beam.sdk.transforms.DoFn.OnTimerFamily
-
The timer ID.
- value() - Element in annotation interface org.apache.beam.sdk.transforms.DoFn.SideInput
-
The SideInput tag ID.
- value() - Element in annotation interface org.apache.beam.sdk.transforms.DoFn.StateId
-
The state ID.
- value() - Element in annotation interface org.apache.beam.sdk.transforms.DoFn.TimerFamily
-
The TimerMap tag ID.
- value() - Element in annotation interface org.apache.beam.sdk.transforms.DoFn.TimerId
-
The timer ID.
- value(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedList
- value(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedMap
- value(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Identical to StateSpecs.value(), but with a coder explicitly supplied.
- Value(int) - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType.Value
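A hedged sketch of the StateSpecs.value entries above inside a stateful DoFn (names are illustrative; the coder-explicit overload is shown):

    import org.apache.beam.sdk.coders.VarIntCoder;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.state.ValueState;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    // Keeps one Integer of state per key and emits a running count.
    class CountPerKeyFn extends DoFn<KV<String, String>, KV<String, Integer>> {
      @StateId("count")
      private final StateSpec<ValueState<Integer>> countSpec =
          StateSpecs.value(VarIntCoder.of());

      @ProcessElement
      public void processElement(
          ProcessContext context, @StateId("count") ValueState<Integer> count) {
        Integer stored = count.read();          // null on first element for the key
        int current = (stored == null ? 0 : stored) + 1;
        count.write(current);
        context.output(KV.of(context.element().getKey(), current));
      }
    }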
- Value(EnumerationType.Value, Object) - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
- VALUE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- VALUE - Static variable in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
- VALUE - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- ValueAndCoderKryoSerializer<T> - Class in org.apache.beam.runners.spark.translation
-
Kryo serializer for ValueAndCoderLazySerializable.
- ValueAndCoderKryoSerializer() - Constructor for class org.apache.beam.runners.spark.translation.ValueAndCoderKryoSerializer
- ValueAndCoderLazySerializable<T> - Class in org.apache.beam.runners.spark.translation
-
A holder object that lets you serialize an element with a Coder with minimal wasted space.
- ValueCaptureType - Enum Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents the capture type of a change stream.
- valueCoder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
- valueCoder() - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.MultimapSideInputHandler
-
Returns the Coder to use for the elements of the resulting values iterable.
- valueEncoderOf(KvCoder<K, V>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
- valueInGlobalWindow(T) - Static method in class org.apache.beam.sdk.values.WindowedValues
-
Returns a WindowedValue with the given value in the GlobalWindow using the default timestamp and pane.
- valueInGlobalWindow(T, PaneInfo) - Static method in class org.apache.beam.sdk.values.WindowedValues
-
Returns a WindowedValue with the given value in the GlobalWindow using the default timestamp and the specified pane.
- ValueInSingleWindow<T> - Class in org.apache.beam.sdk.values
-
An immutable tuple of value, timestamp, window, and pane.
- ValueInSingleWindow() - Constructor for class org.apache.beam.sdk.values.ValueInSingleWindow
- ValueInSingleWindow.Coder<T> - Class in org.apache.beam.sdk.values
-
A coder for ValueInSingleWindow.
- valueOf(int) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
Return an EnumerationType.Value corresponding to one of the enumeration integer values.
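A brief sketch of this lookup (the enumeration values here are arbitrary examples):

    import org.apache.beam.sdk.schemas.logicaltypes.EnumerationType;

    EnumerationType color = EnumerationType.create("RED", "GREEN", "BLUE");
    EnumerationType.Value byNumber = color.valueOf(1);      // GREEN, by integer value
    EnumerationType.Value byName = color.valueOf("GREEN");  // GREEN, by string name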
- valueOf(String) - Static method in enum class org.apache.beam.io.debezium.Connectors
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.FlexResourceSchedulingGoal
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
-
Deprecated. Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.runners.local.ExecutionDriver.DriverState
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator.Type
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.runners.spark.translation.SparkPCollectionView.Type
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.coders.CannotProvideCoderException.ReasonCode
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent.Reason
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Behavior
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options.SorterType
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters.Kind
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.OpType
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.extensions.sql.meta.ProjectSupport
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.PushDownOptions
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.cassandra.CassandraIO.MutationType
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Format
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.FormatProvider
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.PluginType
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.DefaultType
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated. Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.Compression
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated. Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.FileBasedSource.Mode
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.FileIO.ReadMatches.DirectoryTreatment
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.FileSystem.LineageLevel
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.fs.EmptyMatchTreatment
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.fs.MatchResult.Status
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.fs.MoveOptions.StandardMoveOptions
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.SchemaUpdateOption
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.RpcMethod
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.TruncateTimestamps
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation.MutationType
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ExistingPipelineOptions
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.WriteMethod
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.WriteMethod
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows.StartingStrategy
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.ScanType
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.kafka.KafkaTimestampType
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.redis.RedisIO.Write.Method
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.snowflake.enums.CreateDisposition
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.snowflake.enums.StreamingLogLevel
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.snowflake.enums.WriteDisposition
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.solace.data.Solace.DestinationType
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.solace.SolaceIO.SubmissionMode
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.solace.SolaceIO.WriterType
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.TextIO.CompressionType
-
Deprecated. Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.TFRecordIO.CompressionType
-
Deprecated. Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
-
Deprecated. Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.jmh.schemas.RowBundle.Action
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.metrics.Lineage.Type
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.options.PipelineOptions.CheckEnabled
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.Pipeline.PipelineVisitor.CompositeBehavior
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.PipelineResult.State
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.ListQualifier
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.MapQualifier
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier.Kind
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
Return an EnumerationType.Value corresponding to one of the enumeration strings.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.schemas.Schema.EquivalenceNullablePolicy
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.schemas.Schema.TypeName
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout.Kind
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.state.TimeDomain
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.testing.SourceTestUtils.ExpectedSplitOutcome
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.testing.TestStream.EventType
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.transforms.display.DisplayData.Type
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior
-
Deprecated. Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.IsBounded
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.transforms.windowing.PaneInfo.Timing
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.transforms.windowing.Window.ClosingBehavior
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.transforms.windowing.Window.OnTimeBehavior
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.values.PCollection.IsBounded
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class org.apache.beam.sdk.values.WindowingStrategy.AccumulationMode
-
Returns the enum constant of this class with the specified name.
- ValueOrMetadata(boolean, T, MetaT) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
- ValueProvider<T> - Interface in org.apache.beam.sdk.options
-
A ValueProvider abstracts the notion of fetching a value that may or may not be currently available.
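As a quick sketch, the simplest implementation wraps a value that is already known at construction time (the option value here is a placeholder):

    import org.apache.beam.sdk.options.ValueProvider;
    import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;

    ValueProvider<String> table = StaticValueProvider.of("my_table"); // placeholder value
    if (table.isAccessible()) {   // runtime-provided values may not be readable yet
      String resolved = table.get();
    }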
- ValueProvider.Deserializer - Class in org.apache.beam.sdk.options
-
For internal use only; no backwards compatibility guarantees.
- ValueProvider.NestedValueProvider<T, X> - Class in org.apache.beam.sdk.options
-
ValueProvider.NestedValueProvider is an implementation of ValueProvider that allows for wrapping another ValueProvider object.
- ValueProvider.RuntimeValueProvider<T> - Class in org.apache.beam.sdk.options
-
ValueProvider.RuntimeValueProvider is an implementation of ValueProvider that allows for a value to be provided at execution time rather than at graph construction time.
- ValueProvider.Serializer - Class in org.apache.beam.sdk.options
-
For internal use only; no backwards compatibility guarantees.
- ValueProvider.StaticValueProvider<T> - Class in org.apache.beam.sdk.options
-
ValueProvider.StaticValueProvider is an implementation of ValueProvider that allows for a static value to be provided.
- ValueProviders - Class in org.apache.beam.sdk.options
-
Utilities for working with the ValueProvider interface.
- values() - Static method in enum class org.apache.beam.io.debezium.Connectors
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.FlexResourceSchedulingGoal
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
-
Deprecated. Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.runners.local.ExecutionDriver.DriverState
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.runners.spark.translation.SparkCombineFn.WindowedAccumulator.Type
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.runners.spark.translation.SparkPCollectionView.Type
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.coders.CannotProvideCoderException.ReasonCode
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent.Reason
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Behavior
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options.SorterType
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters.Kind
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.OpType
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.extensions.sql.meta.ProjectSupport
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.PushDownOptions
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.cassandra.CassandraIO.MutationType
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Format
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.FormatProvider
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.cdap.PluginConstants.PluginType
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.DefaultType
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated. Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.Compression
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated. Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.FileBasedSource.Mode
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.FileIO.ReadMatches.DirectoryTreatment
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.FileSystem.LineageLevel
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.fs.EmptyMatchTreatment
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.fs.MatchResult.Status
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.fs.MoveOptions.StandardMoveOptions
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.SchemaUpdateOption
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.RpcMethod
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.TruncateTimestamps
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation.MutationType
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ExistingPipelineOptions
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.WriteMethod
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.WriteMethod
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows.StartingStrategy
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.ScanType
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Writes just the values to Kafka.
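A hedged sketch of the usual pattern (broker address, topic, and input are placeholders): parameterize the key type as Void and emit values only.

    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.kafka.common.serialization.StringSerializer;

    pipeline
        .apply(Create.of("a", "b", "c"))
        .apply(KafkaIO.<Void, String>write()
            .withBootstrapServers("broker:9092")      // placeholder address
            .withTopic("results")                     // placeholder topic
            .withValueSerializer(StringSerializer.class)
            .values());                               // write values without keys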
- values() - Static method in enum class org.apache.beam.sdk.io.kafka.KafkaTimestampType
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.redis.RedisIO.Write.Method
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.snowflake.enums.CreateDisposition
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.snowflake.enums.StreamingLogLevel
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.snowflake.enums.WriteDisposition
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.solace.data.Solace.DestinationType
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.solace.SolaceIO.SubmissionMode
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.solace.SolaceIO.WriterType
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.TextIO.CompressionType
-
Deprecated. Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.TFRecordIO.CompressionType
-
Deprecated. Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
-
Deprecated. Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.jmh.schemas.RowBundle.Action
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.metrics.Lineage.Type
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.options.PipelineOptions.CheckEnabled
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.Pipeline.PipelineVisitor.CompositeBehavior
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.PipelineResult.State
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.ListQualifier
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.MapQualifier
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier.Kind
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.schemas.Schema.EquivalenceNullablePolicy
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.schemas.Schema.TypeName
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout.Kind
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
- values() - Method in interface org.apache.beam.sdk.state.MapState
-
Returns an Iterable over the values contained in this map.
- values() - Static method in enum class org.apache.beam.sdk.state.TimeDomain
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.testing.SourceTestUtils.ExpectedSplitOutcome
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.testing.TestStream.EventType
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in class org.apache.beam.sdk.transforms.Deduplicate
-
Returns a deduplication transform that deduplicates values for up to 10 mins within the processing time domain.
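A minimal sketch, assuming an upstream PCollection<String> named events:

    import org.apache.beam.sdk.transforms.Deduplicate;

    // Keeps the first occurrence of each value; duplicates arriving within the
    // default 10-minute processing-time window are dropped.
    PCollection<String> distinct = events.apply(Deduplicate.<String>values());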
- values() - Static method in enum class org.apache.beam.sdk.transforms.display.DisplayData.Type
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior
-
Deprecated. Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.IsBounded
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.transforms.windowing.PaneInfo.Timing
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.transforms.windowing.Window.ClosingBehavior
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.transforms.windowing.Window.OnTimeBehavior
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.values.PCollection.IsBounded
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class org.apache.beam.sdk.values.WindowingStrategy.AccumulationMode
-
Returns an array containing the constants of this enum class, in the order they are declared.
- Values<V> - Class in org.apache.beam.sdk.transforms
-
Values<V> takes a PCollection of KV<K, V>s and returns a PCollection<V> of the values.
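A minimal sketch, assuming an upstream PCollection<KV<String, Integer>> named pairs:

    import org.apache.beam.sdk.transforms.Values;

    // Drops the keys, keeping only the Integer values.
    PCollection<Integer> counts = pairs.apply(Values.<Integer>create());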
- ValueState<T> - Interface in org.apache.beam.sdk.state
-
A ReadableState cell containing a single value.
- ValueWithRecordId<ValueT> - Class in org.apache.beam.sdk.values
-
For internal use only; no backwards compatibility guarantees.
- ValueWithRecordId(ValueT, byte[]) - Constructor for class org.apache.beam.sdk.values.ValueWithRecordId
- ValueWithRecordId.StripIdsDoFn<T> - Class in org.apache.beam.sdk.values
- ValueWithRecordId.ValueWithRecordIdCoder<ValueT> - Class in org.apache.beam.sdk.values
- ValueWithRecordIdCoder(Coder<ValueT>) - Constructor for class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- VARBINARY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- VARCHAR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
- VariableBytes - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A LogicalType representing a variable-length byte array with specified maximum length.
- VariableString - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A LogicalType representing a variable-length string with specified maximum length.
- VarianceFn<T> - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
-
Combine.CombineFn for Variance on Number types.
- VarIntBenchmark - Class in org.apache.beam.sdk.jmh.util
-
Benchmarks for VarInt and variants.
- VarIntBenchmark() - Constructor for class org.apache.beam.sdk.jmh.util.VarIntBenchmark
- VarIntBenchmark.BlackholeOutput - Class in org.apache.beam.sdk.jmh.util
-
Output to Blackhole.
- VarIntBenchmark.Bytes - Class in org.apache.beam.sdk.jmh.util
-
Input from randomly generated bytes.
- VarIntBenchmark.ByteStringOutput - Class in org.apache.beam.sdk.jmh.util
-
Output to ByteStringOutputStream.
- VarIntBenchmark.Longs - Class in org.apache.beam.sdk.jmh.util
-
Input from randomly generated longs.
- VarIntCoder - Class in org.apache.beam.sdk.coders
- VarLongCoder - Class in org.apache.beam.sdk.coders
- verifyBucketAccessible(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Checks whether the GCS bucket exists.
- verifyCompatibility(Schema) - Method in class org.apache.beam.sdk.schemas.transforms.Cast
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Throw IncompatibleWindowException if this WindowFn does not perform the same merging as the given WindowFn.
- verifyDeterministic() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.FooterCoder
- verifyDeterministic() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmRecordCoder
- verifyDeterministic() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShardCoder
- verifyDeterministic() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.KeyPrefixCoder
- verifyDeterministic() - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.MetadataKeyCoder
- verifyDeterministic() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
- verifyDeterministic() - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder
- verifyDeterministic() - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.AtomicCoder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BitSetCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ByteCoder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.Coder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.CustomCoder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.DelegateCoder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.DequeCoder
-
Deque sizes are always known, so DequeIterable may be deterministic while the general IterableLikeCoder is not.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.DoubleCoder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.DurationCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.FloatCoder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.InstantCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.KvCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
LengthPrefixCoder is deterministic if the nested Coder is.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ListCoder
-
List sizes are always known, so ListIterable may be deterministic while the general IterableLikeCoder is not.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.MapCoder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.NullableCoder
-
NullableCoder is deterministic if the nested Coder is.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.OptionalCoder
-
OptionalCoder is deterministic if the nested Coder is.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.SerializableCoder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.SetCoder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.SnappyCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.VarIntCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.VarLongCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.VoidCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ZstdCoder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
-
Throw Coder.NonDeterministicException if the coding is not deterministic.
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.ReadableFileCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
- verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
- verifyDeterministic() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
- verifyDeterministic() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.values.WindowedValues.ValueOnlyWindowedValueCoder
-
Deprecated.
- verifyDeterministic(Coder<?>, String, Iterable<Coder<?>>) - Static method in class org.apache.beam.sdk.coders.Coder
-
Verifies all of the provided coders are deterministic.
- verifyDeterministic(Coder<?>, String, Coder<?>...) - Static method in class org.apache.beam.sdk.coders.Coder
-
Verifies all of the provided coders are deterministic.
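As a sketch, a composite coder typically delegates to this helper; keyCoder and valueCoder are assumed component fields of a hypothetical coder:

    @Override
    public void verifyDeterministic() throws NonDeterministicException {
      // Propagates a NonDeterministicException if any component coder is not deterministic.
      Coder.verifyDeterministic(
          this, "key and value coders must be deterministic", keyCoder, valueCoder);
    }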
- verifyFieldValue(Object, Schema.FieldType, String) - Static method in class org.apache.beam.sdk.values.SchemaVerification
- verifyPAssertsSucceeded(Pipeline, PipelineResult) - Static method in class org.apache.beam.sdk.testing.TestPipeline
-
Verifies all PAsserts in the pipeline have been executed and were successful.
- verifyPath(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
- verifyPath(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
- verifyPath(String) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
-
Validate that a path is a valid path and that the path is accessible.
- VERSION - Static variable in class org.apache.beam.runners.dataflow.internal.IsmFormat.Footer
- VersionDependentFlinkPipelineOptions - Interface in org.apache.beam.runners.flink
- Versioning - Search tag in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- Section
- via(FileIO.Sink<UserT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Like FileIO.Write.via(Contextful), but uses the same FileIO.Sink for all destinations.
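A minimal sketch, assuming an upstream PCollection<String> named lines and a placeholder output directory:

    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.io.TextIO;

    // Every element is written with the same text sink, one format for all destinations.
    lines.apply(FileIO.<String>write()
        .via(TextIO.sink())
        .to("/tmp/output"));   // placeholder directory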
- via(Contextful<Contextful.Fn<DestinationT, FileIO.Sink<UserT>>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Like FileIO.Write.via(Contextful, Contextful), but the output type of the sink is the same as the type of the input collection.
- via(Contextful<Contextful.Fn<NewInputT, Iterable<OutputT>>>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
-
Like FlatMapElements.via(ProcessFunction), but allows access to additional context.
- via(Contextful<Contextful.Fn<NewInputT, OutputT>>) - Method in class org.apache.beam.sdk.transforms.MapElements
-
Like MapElements.via(ProcessFunction), but supports access to context, such as side inputs.
- via(Contextful<Contextful.Fn<UserT, OutputT>>, FileIO.Sink<OutputT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Like FileIO.Write.via(Contextful, Contextful), but uses the same sink for all destinations.
- via(Contextful<Contextful.Fn<UserT, OutputT>>, Contextful<Contextful.Fn<DestinationT, FileIO.Sink<OutputT>>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Specifies how to create a FileIO.Sink for a particular destination and how to map the element type to the sink's output type.
- via(InferableFunction<? super InputT, ? extends Iterable<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.FlatMapElements
-
For an InferableFunction<InputT, ? extends Iterable<OutputT>> fn, returns a PTransform that applies fn to every element of the input PCollection<InputT> and outputs all of the elements to the output PCollection<OutputT>.
- via(InferableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.MapElements
-
For InferableFunction<InputT, OutputT> fn, returns a PTransform that takes an input PCollection<InputT> and returns a PCollection<OutputT> containing fn.apply(v) for every element v in the input.
- via(ProcessFunction<NewInputT, ? extends Iterable<OutputT>>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
-
For a ProcessFunction<InputT, ? extends Iterable<OutputT>> fn, returns a PTransform that applies fn to every element of the input PCollection<InputT> and outputs all of the elements to the output PCollection<OutputT>.
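A minimal sketch, assuming an upstream PCollection<String> named lines:

    import java.util.Arrays;
    import org.apache.beam.sdk.transforms.FlatMapElements;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Each line expands to zero or more words in the output collection.
    PCollection<String> words = lines.apply(
        FlatMapElements.into(TypeDescriptors.strings())
            .via((String line) -> Arrays.asList(line.split(" "))));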
- via(ProcessFunction<NewInputT, OutputT>) - Method in class org.apache.beam.sdk.transforms.MapElements
-
For a ProcessFunction<InputT, OutputT> fn and output type descriptor, returns a PTransform that takes an input PCollection<InputT> and returns a PCollection<OutputT> containing fn.apply(v) for every element v in the input.
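A minimal sketch, assuming an upstream PCollection<String> named words:

    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // One output element per input element: the word's length.
    PCollection<Integer> lengths = words.apply(
        MapElements.into(TypeDescriptors.integers())
            .via((String word) -> word.length()));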
- via(ProcessFunction<KV<K, V>, KV<K, Long>>) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
- via(ProcessFunction<T, Long>) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
- via(SerializableFunction<NewInputT, ? extends Iterable<OutputT>>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
-
Binary compatibility adapter for FlatMapElements.via(ProcessFunction).
- via(SerializableFunction<NewInputT, OutputT>) - Method in class org.apache.beam.sdk.transforms.MapElements
-
Binary compatibility adapter for MapElements.via(ProcessFunction).
- via(SerializableFunction<NewKeyT, K2>) - Method in class org.apache.beam.sdk.transforms.MapKeys
-
Returns a MapKeys<K1, K2, V> PTransform for a ProcessFunction<NewK1, K2> with predefined MapKeys.outputType.
- via(SerializableFunction<NewValueT, V2>) - Method in class org.apache.beam.sdk.transforms.MapValues
-
Returns a MapValues transform for a ProcessFunction<NewV1, V2> with predefined MapValues.outputType.
- via(SimpleFunction<? super InputT, ? extends Iterable<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.FlatMapElements
-
Binary compatibility adapter for FlatMapElements.via(ProcessFunction).
- via(SimpleFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.MapElements
-
Binary compatibility adapter for MapElements.via(InferableFunction).
- viaFlatMapFn(String, Coder<?>) - Static method in class org.apache.beam.sdk.extensions.python.transforms.PythonMap
- viaMapFn(String, Coder<?>) - Static method in class org.apache.beam.sdk.extensions.python.transforms.PythonMap
- viaRandomKey() - Static method in class org.apache.beam.sdk.transforms.Reshuffle
-
Encapsulates the sequence "pair input with unique key, apply Reshuffle.of(), drop the key" commonly used to break fusion.
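A minimal sketch, assuming expanded is a PCollection produced by a fan-out step that should not be fused with the expensive processing that follows:

    import org.apache.beam.sdk.transforms.Reshuffle;

    // Redistributes elements across workers, breaking fusion with the upstream step.
    PCollection<String> redistributed = expanded.apply(Reshuffle.viaRandomKey());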
- VideoIntelligence - Class in org.apache.beam.sdk.extensions.ml
-
Factory class for PTransforms integrating with Google Cloud AI - VideoIntelligence service.
- VideoIntelligence() - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence
- VideoIntelligence.AnnotateVideoFromBytes - Class in org.apache.beam.sdk.extensions.ml
-
A PTransform taking a PCollection of ByteString and an optional side input with a context map and emitting lists of VideoAnnotationResults for each element.
- VideoIntelligence.AnnotateVideoFromBytesWithContext - Class in org.apache.beam.sdk.extensions.ml
-
A PTransform taking a PCollection of KV of ByteString and VideoContext and emitting lists of VideoAnnotationResults for each element.
- VideoIntelligence.AnnotateVideoFromUri - Class in org.apache.beam.sdk.extensions.ml
-
A PTransform taking a PCollection of String and an optional side input with a context map and emitting lists of VideoAnnotationResults for each element.
- VideoIntelligence.AnnotateVideoFromURIWithContext - Class in org.apache.beam.sdk.extensions.ml
- View - Class in org.apache.beam.sdk.transforms
-
Transforms for creating PCollectionViews from PCollections (to read them as side inputs).
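A minimal sketch of the side-input pattern these transforms enable (thresholds is an assumed single-element PCollection<Integer>):

    import org.apache.beam.sdk.transforms.View;
    import org.apache.beam.sdk.values.PCollectionView;

    // Materialize the PCollection as a singleton view...
    PCollectionView<Integer> threshold = thresholds.apply(View.<Integer>asSingleton());
    // ...then pass it to a ParDo via .withSideInputs(threshold) and read it in
    // processElement with c.sideInput(threshold).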
- View.AsIterable<T> - Class in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- View.AsList<T> - Class in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- View.AsMap<K, V> - Class in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- View.AsMultimap<K, V> - Class in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- View.AsSingleton<T> - Class in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- View.CreatePCollectionView<ElemT, ViewT> - Class in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- View.ToListViewDoFn<T> - Class in org.apache.beam.sdk.transforms
-
Provides an index to value mapping using a random starting index and also provides an offset range for each window seen.
- viewAsValues(PCollectionView<V>, Coder<V>) - Static method in class org.apache.beam.sdk.transforms.Reify
-
Pairs each element in a collection with the value of a side input associated with the element's window.
- ViewFn<PrimitiveViewT, ViewT> - Class in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- ViewFn() - Constructor for class org.apache.beam.sdk.transforms.ViewFn
- viewInGlobalWindow(PCollectionView<V>, Coder<V>) - Static method in class org.apache.beam.sdk.transforms.Reify
-
Returns a PCollection consisting of a single element, containing the value of the given view in the global window.
- ViewP - Class in org.apache.beam.runners.jet.processors
-
Jet Processor implementation for Beam's side input producing primitives.
- visitArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
-
Visit a parse tree produced by
FieldSpecifierNotationParser.arrayQualifier()
. - visitArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
-
Visit a parse tree produced by
FieldSpecifierNotationParser.arrayQualifier()
. - visitArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
-
Visit a parse tree produced by the
arrayQualifierList
labeled alternative inFieldSpecifierNotationParser.qualifierList()
. - visitArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
-
Visit a parse tree produced by the
arrayQualifierList
labeled alternative inFieldSpecifierNotationParser.qualifierList()
. - visitDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
-
Visit a parse tree produced by
FieldSpecifierNotationParser.dotExpression()
. - visitDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
-
Visit a parse tree produced by
FieldSpecifierNotationParser.dotExpression()
. - visitErrorNode(ErrorNode) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
- visitFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
-
Visit a parse tree produced by
FieldSpecifierNotationParser.fieldSpecifier()
. - visitFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
-
Visit a parse tree produced by
FieldSpecifierNotationParser.fieldSpecifier()
. - visitMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
-
Visit a parse tree produced by
FieldSpecifierNotationParser.mapQualifier()
. - visitMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
-
Visit a parse tree produced by
FieldSpecifierNotationParser.mapQualifier()
. - visitMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
-
Visit a parse tree produced by the
mapQualifierList
labeled alternative inFieldSpecifierNotationParser.qualifierList()
. - visitMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
-
Visit a parse tree produced by the
mapQualifierList
labeled alternative inFieldSpecifierNotationParser.qualifierList()
. - visitPrimitiveTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.flink.translation.utils.CountingPipelineVisitor
- visitPrimitiveTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.flink.translation.utils.LookupPipelineVisitor
- visitPrimitiveTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
- visitPrimitiveTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.translation.GroupByKeyVisitor
- visitPrimitiveTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.twister2.translators.Twister2BatchPipelineTranslator
- visitPrimitiveTransform(TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
- visitPrimitiveTransform(TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
-
Called for each primitive transform after all of its topological predecessors and inputs have been visited.
- visitQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
-
Visit a parse tree produced by
FieldSpecifierNotationParser.qualifiedComponent()
. - visitQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
-
Visit a parse tree produced by
FieldSpecifierNotationParser.qualifiedComponent()
. - visitQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
-
Visit a parse tree produced by the
qualifyComponent
labeled alternative inFieldSpecifierNotationParser.dotExpressionComponent()
. - visitQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
-
Visit a parse tree produced by the
qualifyComponent
labeled alternative inFieldSpecifierNotationParser.dotExpressionComponent()
. - visitSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
-
Visit a parse tree produced by the
simpleIdentifier
labeled alternative inFieldSpecifierNotationParser.dotExpressionComponent()
. - visitSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
-
Visit a parse tree produced by the
simpleIdentifier
labeled alternative inFieldSpecifierNotationParser.dotExpressionComponent()
. - visitTerminal(TerminalNode) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
- visitValue(PValue, TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
- visitValue(PValue, TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
-
Called for each value after the transform that produced the value has been visited.
- visitWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
-
Visit a parse tree produced by the
wildcard
labeled alternative inFieldSpecifierNotationParser.dotExpressionComponent()
. - visitWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
-
Visit a parse tree produced by the
wildcard
labeled alternative inFieldSpecifierNotationParser.dotExpressionComponent()
. - VOCABULARY - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- VOCABULARY - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- VoidCoder - Class in org.apache.beam.sdk.coders
- voids() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
- VolatileAccumulatorState() - Constructor for class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark.VolatileAccumulatorState
- volatileRead(KafkaIOUtilsBenchmark.VolatileAccumulatorState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- volatileReadWhileWriting(KafkaIOUtilsBenchmark.VolatileAccumulatorState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- volatileWrite(KafkaIOUtilsBenchmark.VolatileAccumulatorState, KafkaIOUtilsBenchmark.ProducerState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- volatileWriteWhileReading(KafkaIOUtilsBenchmark.VolatileAccumulatorState, KafkaIOUtilsBenchmark.ProducerState) - Method in class org.apache.beam.sdk.io.kafka.jmh.KafkaIOUtilsBenchmark
- vpnName() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
-
The name of the VPN to connect to.
- vpnName() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
- vpnName(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory.Builder
-
Set Solace vpn name.
- vpnName(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory.Builder
-
Set Solace vpn name.
- vpnName(String) - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
-
Optional.
W
- Wait - Class in org.apache.beam.sdk.transforms
-
Delays processing of each window in a
PCollection
until signaled. - Wait() - Constructor for class org.apache.beam.sdk.transforms.Wait
- Wait.OnSignal<T> - Class in org.apache.beam.sdk.transforms
-
Implementation of
Wait.on(org.apache.beam.sdk.values.PCollection<?>...)
. - waitForCompletion() - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
-
Wait for all pending requests to complete and check for failures.
- waitForNMessages(int, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Repeatedly pull messages from
TestPubsub.subscriptionPath()
, returns after receivingn
messages or after waiting fortimeoutDuration
. - waitForPort(String, int, int) - Static method in class org.apache.beam.sdk.extensions.python.PythonService
- waitForStart(Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
Future that waits for a start signal for
duration
. - waitForSuccess(Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
Wait for a success signal for
duration
. - waitForUpTo(Duration) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.TestPubsub.PollingAssertion
- waitTillUp(int) - Method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
- waitUntilFinish() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
- waitUntilFinish() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
-
Waits until the pipeline finishes and returns the final status.
- waitUntilFinish() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
- waitUntilFinish() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
- waitUntilFinish() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
- waitUntilFinish() - Method in class org.apache.beam.runners.jet.JetPipelineResult
- waitUntilFinish() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
- waitUntilFinish() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult
- waitUntilFinish() - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
- waitUntilFinish() - Method in interface org.apache.beam.sdk.PipelineResult
-
Waits until the pipeline finishes and returns the final status.
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
-
Waits until the pipeline finishes and returns the final status.
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.jet.JetPipelineResult
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.spark.SparkPipelineResult
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
- waitUntilFinish(Duration) - Method in interface org.apache.beam.sdk.PipelineResult
-
Waits until the pipeline finishes and returns the final status.
- waitUntilFinish(Duration, MonitoringUtil.JobMessagesHandler) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Waits until the pipeline finishes and returns the final status.
- WallTime(Instant) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.WallTime
- WARN - Enum constant in enum class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
-
Deprecated.Level for logging warning messages.
- WARN - Enum constant in enum class org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
-
LogLevel for logging warning messages.
- WARNING - Enum constant in enum class org.apache.beam.sdk.options.PipelineOptions.CheckEnabled
- Watch - Class in org.apache.beam.sdk.transforms
-
Given a "poll function" that produces a potentially growing set of outputs for an input, this transform simultaneously continuously watches the growth of output sets of all inputs, until a per-input termination condition is reached.
- Watch() - Constructor for class org.apache.beam.sdk.transforms.Watch
- Watch.Growth<InputT,
OutputT, - Class in org.apache.beam.sdk.transformsKeyT> - Watch.Growth.PollFn<InputT,
OutputT> - Class in org.apache.beam.sdk.transforms -
A function that computes the current set of outputs for the given input, in the form of a
Watch.Growth.PollResult
. - Watch.Growth.PollResult<OutputT> - Class in org.apache.beam.sdk.transforms
-
The result of a single invocation of a
Watch.Growth.PollFn
. - Watch.Growth.TerminationCondition<InputT,
StateT> - Interface in org.apache.beam.sdk.transforms -
A strategy for determining whether it is time to stop polling the current input regardless of whether its output is complete or not.
- Watch.WatchGrowthFn<InputT,
OutputT, - Class in org.apache.beam.sdk.transformsKeyT, TerminationStateT> - watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
- watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
-
Deprecated.
- watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
-
Same as
AvroIO.Read.watchForNewFiles(Duration, TerminationCondition, boolean)
withmatchUpdatedFiles=false
. - watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
-
Deprecated.
- watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.TextIO.Read
-
Same as
TextIO.Read.watchForNewFiles(Duration, TerminationCondition, boolean)
withmatchUpdatedFiles=false
. - watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
-
Deprecated.
- watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
-
Deprecated.
- watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
-
Continuously watches for new files matching the filepattern, polling it at the given interval, until the given termination condition is reached.
- watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.TextIO.Read
- watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
-
Deprecated.
- WATERMARK - Enum constant in enum class org.apache.beam.sdk.testing.TestStream.EventType
- WatermarkAdvancingStreamingListener() - Constructor for class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.WatermarkAdvancingStreamingListener
- WatermarkCache - Interface in org.apache.beam.sdk.io.gcp.spanner.changestreams.cache
- WatermarkEstimator<WatermarkEstimatorStateT> - Interface in org.apache.beam.sdk.transforms.splittabledofn
-
A
WatermarkEstimator
which is used for estimating output watermarks of a splittableDoFn
. - WatermarkEstimators - Class in org.apache.beam.sdk.fn.splittabledofn
-
Support utilties for interacting with
WatermarkEstimator
s. - WatermarkEstimators - Class in org.apache.beam.sdk.transforms.splittabledofn
-
A set of
WatermarkEstimator
s that users can use to advance the output watermark for their associatedsplittable DoFn
s. - WatermarkEstimators() - Constructor for class org.apache.beam.sdk.fn.splittabledofn.WatermarkEstimators
- WatermarkEstimators.Manual - Class in org.apache.beam.sdk.transforms.splittabledofn
-
Concrete implementation of a
ManualWatermarkEstimator
. - WatermarkEstimators.MonotonicallyIncreasing - Class in org.apache.beam.sdk.transforms.splittabledofn
-
A watermark estimator that observes timestamps of records output from a DoFn reporting the timestamp of the last element seen as the current watermark.
- WatermarkEstimators.WallTime - Class in org.apache.beam.sdk.transforms.splittabledofn
-
A watermark estimator that tracks wall time.
- WatermarkEstimators.WatermarkAndStateObserver<WatermarkEstimatorStateT> - Interface in org.apache.beam.sdk.fn.splittabledofn
-
Interface which allows for accessing the current watermark and watermark estimator state.
- WatermarkEvent() - Constructor for class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
- WatermarkHoldState - Interface in org.apache.beam.sdk.state
-
For internal use only; no backwards-compatibility guarantees.
- WatermarkParameters - Class in org.apache.beam.sdk.io.aws2.kinesis
-
WatermarkParameters
contains the parameters used for watermark computation. - WatermarkParameters() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
- WatermarkPolicy - Interface in org.apache.beam.sdk.io.aws2.kinesis
-
Implement this interface to define a custom watermark calculation heuristic.
- WatermarkPolicyFactory - Interface in org.apache.beam.sdk.io.aws2.kinesis
-
Implement this interface to create a
WatermarkPolicy
. - WatermarkPolicyFactory.ArrivalTimeWatermarkPolicy - Class in org.apache.beam.sdk.io.aws2.kinesis
-
ArrivalTimeWatermarkPolicy uses
WatermarkPolicyFactory.CustomWatermarkPolicy
for watermark computation. - WatermarkPolicyFactory.CustomWatermarkPolicy - Class in org.apache.beam.sdk.io.aws2.kinesis
-
CustomWatermarkPolicy uses parameters defined in
WatermarkParameters
to compute watermarks. - WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy - Class in org.apache.beam.sdk.io.aws2.kinesis
-
Watermark policy where the processing time is used as the event time.
- Watermarks - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- watermarkStateInternal(TimestampCombiner) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- WEAKEN - Enum constant in enum class org.apache.beam.sdk.schemas.Schema.EquivalenceNullablePolicy
- WebIdTokenProvider - Interface in org.apache.beam.sdk.io.aws2.auth
-
Defines the behavior for a OIDC web identity token provider.
- webIdTokenProviderFQCN() - Method in class org.apache.beam.sdk.io.aws2.auth.StsAssumeRoleForFederatedCredentialsProvider
- WebPathParser - Class in org.apache.beam.sdk.io.gcp.healthcare
- WebPathParser() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser
- WebPathParser.DicomWebPath - Class in org.apache.beam.sdk.io.gcp.healthcare
- weeks(int, int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
-
Returns a
WindowFn
that windows elements into periods measured by weeks. - WeightedList<T> - Class in org.apache.beam.sdk.fn.data
- WeightedList(List<T>, long) - Constructor for class org.apache.beam.sdk.fn.data.WeightedList
- What it does - Search tag in class org.apache.beam.io.debezium.SourceRecordJson
- Section
- where(Type, Type) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
A more general form of
TypeDescriptor.where(TypeParameter, TypeDescriptor)
that returns a newTypeDescriptor
by matchingformal
againstactual
to resolve type variables in the currentTypeDescriptor
. - where(TypeParameter<X>, TypeDescriptor<X>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a new
TypeDescriptor
where the type variable represented bytypeParameter
are substituted bytype
. - whereFieldId(int, SerializableFunction<FieldT, Boolean>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
-
Set a predicate based on the value of a field, where the field is specified by id.
- whereFieldIds(List<Integer>, SerializableFunction<Row, Boolean>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
-
Set a predicate based on the value of multipled fields, specified by id.
- whereFieldName(String, SerializableFunction<FieldT, Boolean>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
-
Set a predicate based on the value of a field, where the field is specified by name.
- whereFieldNames(List<String>, SerializableFunction<Row, Boolean>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
-
Set a predicate based on the value of multipled fields, specified by name.
- widening(Schema) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
- Widening() - Constructor for class org.apache.beam.sdk.schemas.transforms.Cast.Widening
- WILDCARD - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- WILDCARD - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- WILDCARD() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.WildcardContext
- WildcardContext(FieldSpecifierNotationParser.DotExpressionComponentContext) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.WildcardContext
- wildcardToRegexp(String) - Static method in class org.apache.beam.sdk.io.FileSystemUtils
-
Expands glob expressions to regular expressions.
- WINDMILL_SERVICE_EXPERIMENT - Static variable in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
Deprecated.Use STREAMING_ENGINE_EXPERIMENT instead.
- WindmillServiceStreamingRpcBatchLimitFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.WindmillServiceStreamingRpcBatchLimitFactory
- window() - Method in class org.apache.beam.runners.twister2.utils.Twister2AssignContext
- window() - Method in interface org.apache.beam.sdk.state.StateContext
-
Returns the window corresponding to the state.
- window() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
-
Returns the window in which the timer is firing.
- window() - Method in class org.apache.beam.sdk.transforms.DoFn.OnWindowExpirationContext
-
Returns the window in which the window expiration is firing.
- window() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
-
Returns the window of the current element prior to this
WindowFn
being called. - Window<T> - Class in org.apache.beam.sdk.transforms.windowing
-
Window
logically divides up or groups the elements of aPCollection
into finite windows according to aWindowFn
. - Window() - Constructor for class org.apache.beam.sdk.transforms.windowing.Window
- WINDOW_END - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.TVFStreamingUtils
- WINDOW_START - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.TVFStreamingUtils
- Window.Assign<T> - Class in org.apache.beam.sdk.transforms.windowing
-
A Primitive
PTransform
that assigns windows to elements based on aWindowFn
. - Window.ClosingBehavior - Enum Class in org.apache.beam.sdk.transforms.windowing
-
Specifies the conditions under which a final pane will be created when a window is permanently closed.
- Window.OnTimeBehavior - Enum Class in org.apache.beam.sdk.transforms.windowing
-
Specifies the conditions under which an on-time pane will be created when a window is closed.
- windowCoder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
- windowCoder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
- windowCoder() - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Returns the
Coder
used for serializing the windows used by this windowFn. - windowCoder(PCollection<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator
- WindowDoFnOperator<K,
InputT, - Class in org.apache.beam.runners.flink.translation.wrappers.streamingOutputT> -
Flink operator for executing window
DoFns
. - WindowDoFnOperator(SystemReduceFn<K, InputT, ?, OutputT, BoundedWindow>, String, Coder<WindowedValue<KeyedWorkItem<K, InputT>>>, TupleTag<KV<K, OutputT>>, List<TupleTag<?>>, DoFnOperator.OutputManagerFactory<KV<K, OutputT>>, WindowingStrategy<?, ?>, Map<Integer, PCollectionView<?>>, Collection<PCollectionView<?>>, PipelineOptions, Coder<K>, KeySelector<WindowedValue<KeyedWorkItem<K, InputT>>, ?>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.WindowDoFnOperator
- WindowedContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.WindowedContext
- windowedEncoder(Coder<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- windowedEncoder(Coder<T>, Coder<W>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- windowedEncoder(Encoder<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- windowedFilename(int, int, BoundedWindow, PaneInfo, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
- windowedFilename(int, int, BoundedWindow, PaneInfo, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
-
When a sink has requested windowed or triggered output, this method will be invoked to return the file
resource
to be created given the base output directory and aFileBasedSink.OutputFileHints
containing information about the file, including a suggested extension (e.g. - WindowedKvKeySelector<InputT,
K> - Class in org.apache.beam.runners.flink.translation.types - WindowedKvKeySelector(Coder<K>, Coder<? extends BoundedWindow>) - Constructor for class org.apache.beam.runners.flink.translation.types.WindowedKvKeySelector
- windowedMultiReceiver(DoFn.WindowedContext) - Static method in class org.apache.beam.sdk.transforms.DoFnOutputReceivers
-
Returns a
DoFn.MultiOutputReceiver
that delegates to aDoFn.WindowedContext
. - windowedMultiReceiver(DoFn.WindowedContext, Map<TupleTag<?>, Coder<?>>) - Static method in class org.apache.beam.sdk.transforms.DoFnOutputReceivers
-
Returns a
DoFn.MultiOutputReceiver
that delegates to aDoFn.WindowedContext
. - windowedReceiver(DoFn.WindowedContext, TupleTag<T>) - Static method in class org.apache.beam.sdk.transforms.DoFnOutputReceivers
-
Returns a
DoFn.OutputReceiver
that delegates to aDoFn.WindowedContext
. - WindowedValue<T> - Interface in org.apache.beam.sdk.values
-
A value along with Beam's windowing information and all other metadata.
- windowedValueEncoder(Encoder<T>, Encoder<W>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
-
Creates a Spark
Encoder
forEncoderHelpers
ofStructType
with fieldsvalue
,timestamp
,windows
andpane
. - windowedValues(Iterable<WindowedValue<T>>) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new
Create.WindowedValues
transform that produces aPCollection
containing the elements of the providedIterable
with the specified windowing metadata. - windowedValues(WindowedValue<T>, WindowedValue<T>...) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new
Create.WindowedValues
transform that produces aPCollection
containing the specified elements with the specified windowing metadata. - WindowedValues - Class in org.apache.beam.sdk.values
-
Implementations of
WindowedValue
and static utility methods. - WindowedValues.FullWindowedValueCoder<T> - Class in org.apache.beam.sdk.values
-
Coder for
WindowedValue
. - WindowedValues.ParamWindowedValueCoder<T> - Class in org.apache.beam.sdk.values
-
A parameterized coder for
WindowedValue
. - WindowedValues.SingleWindowedValue - Interface in org.apache.beam.sdk.values
-
A
WindowedValues
which holds exactly single window per value. - WindowedValues.ValueOnlyWindowedValueCoder<T> - Class in org.apache.beam.sdk.values
-
Deprecated.Use ParamWindowedValueCoder instead, it is a general purpose implementation of the same concept but makes timestamp, windows and pane info configurable.
- WindowedValues.WindowedValueCoder<T> - Class in org.apache.beam.sdk.values
-
Abstract class for
WindowedValue
coder. - windowedWrites - Variable in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
Whether windowed writes are being used.
- windowEncoder() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
- WindowFn<T,
W> - Class in org.apache.beam.sdk.transforms.windowing -
The argument to the
Window
transform used to assign elements into windows and to determine how windows are merged. - WindowFn() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowFn
- WindowFn.AssignContext - Class in org.apache.beam.sdk.transforms.windowing
-
Information available when running
WindowFn.assignWindows(org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext)
. - WindowFn.MergeContext - Class in org.apache.beam.sdk.transforms.windowing
-
Information available when running
WindowFn.mergeWindows(org.apache.beam.sdk.transforms.windowing.WindowFn.MergeContext)
. - WindowFnTestUtils - Class in org.apache.beam.sdk.testing
-
A utility class for testing
WindowFn
s. - WindowFnTestUtils() - Constructor for class org.apache.beam.sdk.testing.WindowFnTestUtils
- WindowGroupP<K,
V> - Class in org.apache.beam.runners.jet.processors -
Jet
Processor
implementation for Beam's GroupByKeyOnly + GroupAlsoByWindow primitives. - Windowing - Search tag in class org.apache.beam.sdk.transforms.windowing.Window
- Section
- WINDOWING_STRATEGY - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
- windowingStrategy - Variable in class org.apache.beam.runners.flink.translation.functions.FlinkPartialReduceFunction
- windowingStrategy - Variable in class org.apache.beam.runners.flink.translation.functions.FlinkReduceFunction
- windowingStrategy - Variable in class org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator
- WindowingStrategy<T,
W> - Class in org.apache.beam.sdk.values -
A
WindowingStrategy
describes the windowing behavior for a specific collection of values. - WindowingStrategy.AccumulationMode - Enum Class in org.apache.beam.sdk.values
-
The accumulation modes that can be used with windowing.
- WindowIntoTransformProvider - Class in org.apache.beam.sdk.expansion.service
-
An implementation of
TypedSchemaTransformProvider
for WindowInto. - WindowIntoTransformProvider() - Constructor for class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
- WindowIntoTransformProvider.Configuration - Class in org.apache.beam.sdk.expansion.service
- WindowIntoTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.expansion.service
- WindowMappingFn<TargetWindowT> - Class in org.apache.beam.sdk.transforms.windowing
-
A function that takes the windows of elements in a main input and maps them to the appropriate window in a
PCollectionView
consumed as aside input
. - WindowMappingFn() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
-
Create a new
WindowMappingFn
withzero
maximum lookback. - WindowMappingFn(Duration) - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
-
Create a new
WindowMappingFn
with the specified maximum lookback. - windowOnlyContext(W) - Static method in class org.apache.beam.sdk.state.StateContexts
- windows() - Static method in class org.apache.beam.sdk.transforms.Reify
-
Create a
PTransform
that will reify information from the processing context into instances ofValueInSingleWindow
. - windows() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.MergeContext
-
Returns the current set of windows.
- windowsInValue() - Static method in class org.apache.beam.sdk.transforms.Reify
-
Create a
PTransform
that will output all inputKVs
with the window pane info inside the value. - WireCoders - Class in org.apache.beam.runners.fnexecution.wire
-
Helpers to construct coders for gRPC port reads and writes.
- with(PTransform<PBegin, PCollection<T>>) - Static method in class org.apache.beam.sdk.transforms.Flatten
-
Returns a
PTransform
that flattens the inputPCollection
with the output of anotherPTransform
resulting in aPCollection
containing all the elements of both the inputPCollection
s and the output of the givenPTransform
as its output. - with(SimpleFunction<DataT, InputT>, Coder<InputT>, CombineFnBase.GlobalCombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
Returns a
CombineFns.ComposedCombineFnWithContext
with an additionalCombineFnBase.GlobalCombineFn
. - with(SimpleFunction<DataT, InputT>, Coder, Combine.CombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
-
Like
CombineFns.ComposeCombineFnBuilder.with(SimpleFunction, CombineFn, TupleTag)
but with an explicit input coder. - with(SimpleFunction<DataT, InputT>, Coder, Combine.CombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
Returns a
CombineFns.ComposedCombineFn
with an additionalCombine.CombineFn
. - with(SimpleFunction<DataT, InputT>, Coder, CombineWithContext.CombineFnWithContext<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
-
Like
CombineFns.ComposeCombineFnBuilder.with(SimpleFunction, CombineFnWithContext, TupleTag)
but with input coder. - with(SimpleFunction<DataT, InputT>, Coder, CombineWithContext.CombineFnWithContext<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
Returns a
CombineFns.ComposedCombineFnWithContext
with an additionalCombineWithContext.CombineFnWithContext
. - with(SimpleFunction<DataT, InputT>, Combine.CombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
-
Returns a
CombineFns.ComposedCombineFn
that can take additionalGlobalCombineFns
and apply them as a single combine function. - with(SimpleFunction<DataT, InputT>, Combine.CombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
Returns a
CombineFns.ComposedCombineFn
with an additionalCombine.CombineFn
. - with(SimpleFunction<DataT, InputT>, CombineFnBase.GlobalCombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
Returns a
CombineFns.ComposedCombineFnWithContext
with an additionalCombineFnBase.GlobalCombineFn
. - with(SimpleFunction<DataT, InputT>, CombineWithContext.CombineFnWithContext<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
-
Returns a
CombineFns.ComposedCombineFnWithContext
that can take additionalGlobalCombineFns
and apply them as a single combine function. - with(SimpleFunction<DataT, InputT>, CombineWithContext.CombineFnWithContext<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
Returns a
CombineFns.ComposedCombineFnWithContext
with an additionalCombineWithContext.CombineFnWithContext
. - with(PCollection<KV<K, V2>>) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join.InnerJoin
- with(PCollection<KV<K, V2>>, V1) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join.RightOuterJoin
- with(PCollection<KV<K, V2>>, V1, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join.FullOuterJoin
- with(PCollection<KV<K, V2>>, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join.LeftOuterJoin
- with(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Flatten
-
Returns a
PTransform
that flattens the inputPCollection
with a givenPCollection
resulting in aPCollection
containing all the elements of bothPCollection
s as its output. - WITH_FIELD_REORDERING - Enum constant in enum class org.apache.beam.sdk.extensions.sql.meta.ProjectSupport
- withAccelerator(String) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
-
Declares hardware accelerators that are desired to have in the execution environment.
- withAccuracy(double, double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
Returns a new
SketchFrequencies.CountMinSketchFn
combiner with new precision accuracy parametersepsilon
andconfidence
. - withAddresses(List<String>) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
-
Define the AMQP addresses where to receive messages.
- withAdminUrl(String) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
- withAllFields() - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
- withAllowableResponseErrors(Set<String>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
-
Provide a set of textual error types which can be contained in Bulk API response items[].error.type field.
- withAllowableResponseErrors(Set<String>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withAllowDuplicates() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- withAllowDuplicates(boolean) - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeArbitrarily
- withAllowDuplicates(boolean) - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeByKey
- withAllowDuplicates(Boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withAllowedLateness(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Override the amount of lateness allowed for data elements in the output
PCollection
and downstreamPCollections
until explicitly set again. - withAllowedLateness(Duration) - Method in class org.apache.beam.sdk.values.WindowingStrategy
- withAllowedLateness(Duration, Window.ClosingBehavior) - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Override the amount of lateness allowed for data elements in the pipeline.
- withAllowedTimestampSkew(Duration) - Method in class org.apache.beam.sdk.transforms.WithTimestamps
-
Deprecated.This method permits a to elements to be emitted behind the watermark. These elements are considered late, and if behind the
allowed lateness
of a downstreamPCollection
may be silently dropped. See https://github.com/apache/beam/issues/18065 for details on a replacement. - withAlreadyMerged(boolean) - Method in class org.apache.beam.sdk.values.WindowingStrategy
- withApiKey(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
If Elasticsearch authentication is enabled, provide an API key.
- withAppendOnly(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
-
Provide an instruction to control whether the target index should be considered append-only.
- withAppendOnly(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withAppProfileId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will read using the specified app profile id. - withAppProfileId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Returns a new
BigtableIO.ReadChangeStream
that will stream from the cluster specified by app profile id. - withAppProfileId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
that will write using the specified app profile id. - withAppProfileId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will read using the specified app profile id. - withAppProfileId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
that will write using the specified app profile id. - withApproximateTrim(boolean) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
-
If
RedisIO.WriteStreams.withMaxLen(long)
is used, set the "~" prefix to the MAXLEN value, indicating to the server that it should use "close enough" trimming. - withArgs(Object...) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
-
Positional arguments for the Python cross-language transform.
- withArrivalTimePolicy() - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory
-
Returns an ArrivalTimeWatermarkPolicy.
- withArrivalTimePolicy(Duration) - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory
-
Returns an ArrivalTimeWatermarkPolicy.
- withArrivalTimeWatermarkPolicy() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specifies the
WatermarkPolicyFactory
as ArrivalTimeWatermarkPolicyFactory. - withArrivalTimeWatermarkPolicy(Duration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specifies the
WatermarkPolicyFactory
as ArrivalTimeWatermarkPolicyFactory. - withAttemptTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
with the attempt timeout. - withAttemptTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
with the attempt timeout. - withAuth(String) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
Define the password to authenticate on the Redis server.
- withAuth(String) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
- withAuth(String) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
- withAuth(String) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
- withAuth(String) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
-
Use the redis AUTH command when connecting to the server; the format of the string can be either just a password or a username and password separated by a space.
- withAuth(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
- withAuthenticator(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets authenticator for Snowflake.
- withAutoLoading(boolean) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
- withAutoScaler(AutoScaler) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
-
Sets the
AutoScaler
to use for reporting backlog during the execution of this source. - withAutoSchemaUpdate(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If true, enables automatically detecting BigQuery table schema updates.
- withAutoSharding() - Method in class org.apache.beam.sdk.io.FileIO.Write
- withAutoSharding() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If true, enables using a dynamically determined number of shards to write to BigQuery.
- withAutoSharding() - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
- withAutoSharding() - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
-
If true, enables using a dynamically determined number of shards to write.
- withAutoSharding() - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
-
If true, enables using a dynamically determined number of shards to write.
- withAutoSharding() - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
- withAutoSharding() - Method in class org.apache.beam.sdk.io.TextIO.Write
- withAutoSharding() - Method in class org.apache.beam.sdk.io.WriteFiles
- withAvroDataModel(GenericData) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
-
Define the Avro data model; see
AvroParquetReader.Builder.withDataModel(GenericData)
. - withAvroDataModel(GenericData) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
-
Define the Avro data model; see
AvroParquetReader.Builder.withDataModel(GenericData)
. - withAvroDataModel(GenericData) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
-
Define the Avro data model; see
AvroParquetWriter.Builder.withDataModel(GenericData)
. - withAvroFormatFunction(SerializableFunction<AvroWriteRequest<T>, GenericRecord>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Formats the user's type into a
GenericRecord
to be written to BigQuery. - withAvroSchemaFactory(SerializableFunction<TableSchema, Schema>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Uses the specified function to convert a
TableSchema
to aSchema
. - withAvroWriter(SerializableFunction<Schema, DatumWriter<T>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Writes the user's type as avro using the supplied
DatumWriter
. - withAvroWriter(SerializableFunction<AvroWriteRequest<T>, AvroT>, SerializableFunction<Schema, DatumWriter<AvroT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Convert's the user's type to an avro record using the supplied avroFormatFunction.
- withBackendVersion(int) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
-
Use to set explicitly which version of Elasticsearch the destination cluster is running.
- withBackendVersion(int) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withBacklogReplicationAdjustment(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Returns a new
BigtableIO.ReadChangeStream
that overrides the replication delay adjustment duration with the provided duration. - withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
See
FileIO.Write.withBadRecordErrorHandler(ErrorHandler)
for details on usage. - withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Configures a new
FileIO.Write
with an ErrorHandler. - withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Configure a
ErrorHandler.BadRecordErrorHandler
for sending records to if they fail to serialize when being sent to Kafka. - withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
See
FileIO.Write.withBadRecordErrorHandler(ErrorHandler)
for details on usage. - withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.WriteFiles
-
See
FileIO.Write.withBadRecordErrorHandler(ErrorHandler)
for details on usage. - withBaseFilename(ResourceId) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
-
Sets the base filename.
- withBaseFilename(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
-
Like
DefaultFilenamePolicy.Params.withBaseFilename(ResourceId)
, but takes in aValueProvider
. - withBasicCredentials(String, String) - Method in class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
-
If Solr basic authentication is enabled, provide the username and password.
- withBatchCount(Integer) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
-
Sets batchCount for sending multiple events in a single request to the HEC.
- withBatchCount(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
-
Same as
SplunkIO.Write.withBatchCount(Integer)
but withValueProvider
. - withBatching(boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
If true the uses Cloud Spanner batch API.
- withBatching(boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
By default the PartitionQuery API is used to read data from Cloud Spanner.
- withBatchInitialCount(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the initial size of a batch; used in the absence of the QoS system having significant data to determine a better batch size.
- withBatchMaxBufferingDuration(Duration) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Returns a new
TextIO.TypedWrite
that will batch the input records using specified max buffering duration. - withBatchMaxBufferingDuration(Duration) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withBatchMaxBufferingDuration(Duration) - Method in class org.apache.beam.sdk.io.WriteFiles
-
Returns a new
WriteFiles
that will batch the input records using specified max buffering duration. - withBatchMaxBytes(int) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
-
Max.
- withBatchMaxBytes(long) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the maximum number of bytes to include in a batch.
- withBatchMaxCount(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the maximum number of writes to include in a batch.
- withBatchMaxRecords(int) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
-
Max.
- withBatchSize(int) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
-
The batch size to use, default (and AWS limit) is
10
. - withBatchSize(int) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
-
Number of elements to batch.
- withBatchSize(int) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
-
Reads from the table in batches of the specified size.
- withBatchSize(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
- withBatchSize(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
- withBatchSize(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
-
Provide a size for the scroll read.
- withBatchSize(long) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
-
Sets batch size for the write operation.
- withBatchSize(long) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
- withBatchSize(long) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
-
Provide a maximum size in number of SQL statement for the batch.
- withBatchSize(long) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
- withBatchSize(long) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
-
Define the size of the batch to group write operations.
- withBatchSize(long) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- withBatchSize(Integer) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Returns a new
TextIO.TypedWrite
that will batch the input records using specified batch size. - withBatchSize(Integer) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withBatchSize(Integer) - Method in class org.apache.beam.sdk.io.WriteFiles
-
Returns a new
WriteFiles
that will batch the input records using specified batch size. - withBatchSize(Integer) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
- withBatchSize(Integer) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- withBatchSize(Integer) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
-
Provide a maximum number of rows that is written by one SQL statement.
- withBatchSize(ValueProvider<Long>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- withBatchSizeBytes(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the batch size limit (max number of bytes mutated per batch).
- withBatchSizeBytes(Integer) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Returns a new
TextIO.TypedWrite
that will batch the input records using specified batch size in bytes. - withBatchSizeBytes(Integer) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withBatchSizeBytes(Integer) - Method in class org.apache.beam.sdk.io.WriteFiles
-
Returns a new
WriteFiles
that will batch the input records using specified batch size in bytes. - withBatchTargetLatency(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Target latency for batch requests.
- withBatchTimeout(Duration) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
-
The duration to accumulate records before timing out, default is 3 secs.
- withBatchTimeout(Duration, boolean) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
-
The duration to accumulate records before timing out, default is 3 secs.
- withBeamRowConverters(TypeDescriptor<Struct>, SpannerIO.Read.ToBeamRowFunction, SpannerIO.Read.FromBeamRowFunction) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withBeamRowConverters(TypeDescriptor<T>, BigQueryIO.TypedRead.ToBeamRowFunction<T>, BigQueryIO.TypedRead.FromBeamRowFunction<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
Sets the functions to convert elements to/from
Row
objects. - withBeamSchemas(boolean) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
-
If set to true, a Beam schema will be inferred from the AVRO schema.
- withBeamSchemas(boolean) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
-
Deprecated.If set to true, a Beam schema will be inferred from the AVRO schema.
- withBeamSchemas(boolean) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
-
If set to true, a Beam schema will be inferred from the AVRO schema.
- withBeamSchemas(boolean) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
- withBeamSchemas(boolean) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
- withBearerToken(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
If Elasticsearch authentication is enabled, provide a bearer token.
- withBigLakeConfiguration(Map<String, String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Specifies a configuration to create BigLake tables.
- withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
-
Deprecated. Please set the options directly in BigtableIO.
- withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Deprecated. Please set the configurations directly: BigtableIO.read().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
- withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Deprecated. Please configure the write options directly: BigtableIO.write().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
- withBigtableOptions(BigtableOptions.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Deprecated. Please set the configurations directly: BigtableIO.read().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
- withBigtableOptions(BigtableOptions.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Deprecated. Please configure the write options directly: BigtableIO.write().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
- withBigtableOptionsConfigurator(SerializableFunction<BigtableOptions.Builder, BigtableOptions.Builder>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
-
Deprecated. Please set the options directly in BigtableIO.
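Usage sketch of the direct configuration recommended by the deprecation notes above; the project, instance, and table IDs are placeholders:
  BigtableIO.Read read = BigtableIO.read()
      .withProjectId("my-project")
      .withInstanceId("my-instance")
      .withTableId("my-table");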
- withBigtableOptionsConfigurator(SerializableFunction<BigtableOptions.Builder, BigtableOptions.Builder>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Deprecated. Please set the configurations directly: BigtableIO.read().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
- withBigtableOptionsConfigurator(SerializableFunction<BigtableOptions.Builder, BigtableOptions.Builder>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Deprecated. Please configure the write options directly: BigtableIO.write().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
- withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets the bootstrap servers for the Kafka consumer.
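Usage sketch for the KafkaIO.Read entry above, assuming a Pipeline p; the broker address and topic name are placeholders:
  PCollection<KafkaRecord<String, String>> records = p.apply(
      KafkaIO.<String, String>read()
          .withBootstrapServers("localhost:9092")
          .withTopic("events")
          .withKeyDeserializer(StringDeserializer.class)
          .withValueDeserializer(StringDeserializer.class));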
- withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Sets the bootstrap servers to use for the Kafka consumer if unspecified via KafkaSourceDescriptor#getBootStrapServers().
- withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over KafkaIO.WriteRecords.withBootstrapServers(String), used to keep compatibility with the old API based on the KV element type.
- withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Returns a new KafkaIO.Write transform with the Kafka producer pointing to bootstrapServers.
- withBucket(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
- withBucket(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
- withBucketAuto(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
-
Sets whether to use $bucketAuto or not.
- withBulkDirective(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
-
Sets the bulk directive representation of an input document.
- withByteSize(long) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
- withByteSize(long, SerializableFunction<InputT, Long>) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
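Usage sketch for the GroupIntoBatches#withByteSize entries above, assuming an existing PCollection<KV<String, Integer>> input; the size limits are placeholders:
  PCollection<KV<String, Iterable<Integer>>> batches = input.apply(
      GroupIntoBatches.<String, Integer>ofSize(100).withByteSize(1L << 20));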
- withCache(Cache.Pair<RequestT, ResponseT>) - Method in class org.apache.beam.io.requestresponse.RequestResponseIO
-
Configures RequestResponseIO for reading and writing RequestT and ResponseT pairs using a cache.
- withCallShouldBackoff(CallShouldBackoff<ResponseT>) - Method in class org.apache.beam.io.requestresponse.RequestResponseIO
-
Overrides the package-private implementation of CallShouldBackoff, based on https://sre.google/sre-book/handling-overload, that determines whether the underlying DoFn should hold requests.
- withCatalogName(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
- withCatalogName(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
- withCatalogName(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- withCatalogName(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- withCatalogName(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
- withCdapPlugin(Plugin<K, V>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
-
Sets a CDAP Plugin.
- withCdapPlugin(Plugin<K, V>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
-
Sets a CDAP Plugin.
- withCdapPluginClass(Class<?>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
-
Sets a CDAP Plugin class.
- withCdapPluginClass(Class<?>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
-
Sets a CDAP Plugin class.
- withCdc() - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- withChangeStreamName(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Returns a new BigtableIO.ReadChangeStream that uses changeStreamName as the prefix for the metadata table.
- withChangeStreamName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the change stream name.
- withCharset(Charset) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
Sets the XML file charset.
- withCharset(Charset) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
- withCharset(Charset) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
- withCharset(Charset) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
-
Sets the charset used to write the file.
- withCheckStopReadingFn(CheckStopReadingFn) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
A custom CheckStopReadingFn that determines whether the ReadFromKafkaDoFn should stop reading from the given TopicPartition.
- withCheckStopReadingFn(CheckStopReadingFn) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
A custom CheckStopReadingFn that determines whether the ReadFromKafkaDoFn should stop reading from the given TopicPartition.
- withCheckStopReadingFn(SerializableFunction<TopicPartition, Boolean>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
A custom SerializableFunction that determines whether the ReadFromKafkaDoFn should stop reading from the given TopicPartition.
- withCheckStopReadingFn(SerializableFunction<TopicPartition, Boolean>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
A custom SerializableFunction that determines whether the ReadFromKafkaDoFn should stop reading from the given TopicPartition.
- withChunkSize(Long) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
- withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
-
Configuration of DynamoDB client.
- withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Write
-
Configuration of DynamoDB client.
- withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Configuration of Kinesis & Cloudwatch clients.
- withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
-
Configuration of Kinesis client.
- withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.sns.SnsIO.Write
-
Configuration of SNS client.
- withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Read
-
Configuration of SQS client.
- withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Write
-
Deprecated. Configuration of SQS client.
- withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
-
Configuration of SQS client.
- withClientFactory(PubsubClient.PubsubClientFactory) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
The default client to write to Pub/Sub is the PubsubJsonClient, created by the PubsubJsonClient.PubsubJsonClientFactory.
- withClientFactory(PubsubClient.PubsubClientFactory) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
The default client to write to Pub/Sub is the PubsubJsonClient, created by the PubsubJsonClient.PubsubJsonClientFactory.
- withClientId(String) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
-
Set up the client ID prefix, which is used to construct a unique client ID.
- withClientUrl(String) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
- withClientUrl(String) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Write
- withCloseTimeout(Duration) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
-
Sets the amount of time to wait for callbacks from the runner stating that the output has been durably persisted before closing the connection to the JMS broker.
- withClosingBehavior(Window.ClosingBehavior) - Method in class org.apache.beam.sdk.values.WindowingStrategy
- withClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Allows writing to clustered tables when BigQueryIO.Write.to(SerializableFunction) or BigQueryIO.Write.to(DynamicDestinations) is used.
- withClustering(Clustering) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Specifies the clustering fields to use when writing to a single output table.
- withCodec(CodecFactory) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
-
Specifies to use the given CodecFactory for each generated file.
- withCodec(CodecFactory) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Writes to Avro file(s) compressed using specified codec.
- withCodec(CodecFactory) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- withCoder(Coder<OutputT>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
-
Deprecated. JdbcIO is able to infer appropriate coders from other parameters.
- withCoder(Coder<OutputT>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- withCoder(Coder<T>) - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
-
Applies a Coder to the connector.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
-
Sets a coder for the result of the parse function.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
-
Deprecated. Specifies the coder for the result of the parseFn.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
-
Specifies the coder for the result of the parseFn.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
-
Sets a coder for the result of the read function.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
-
Sets a coder for the result of the read function.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
-
Specifies the coder for the result of the AvroSource.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.Read
-
Specify the Coder used to serialize the document in the PCollection.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the Coder used to serialize the entity in the PCollection.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.ReadAll
-
Specify the Coder used to serialize the entity in the PCollection.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
Sets a Coder for the result of the parse function.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
-
Deprecated. JdbcIO is able to infer appropriate coders from other parameters.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
-
Deprecated. JdbcIO is able to infer appropriate coders from other parameters.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
-
Sets a Coder for the result of the parse function.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
-
Specify the output coder to use for output of the ParseFn.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ParseFiles
-
Specify the output coder to use for output of the ParseFn.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
A Coder to be used by the output PCollection generated by the source.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
-
Returns a Create.TimestampedValues PTransform like this one that uses the given Coder<T> to decode each of the objects into a value of type T.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.transforms.Create.Values
-
Returns a Create.Values PTransform like this one that uses the given Coder<T> to decode each of the objects into a value of type T.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.transforms.Create.WindowedValues
-
Returns a Create.WindowedValues PTransform like this one that uses the given Coder<T> to decode each of the objects into a value of type T.
- withCoderAndParseFn(Coder<T>, SimpleFunction<PubsubMessage, T>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Causes the source to return a PubsubMessage that includes Pubsub attributes, and uses the given parsing function to transform the PubsubMessage into an output type.
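Usage sketch for the Create.Values#withCoder entries above, pinning an explicit coder when inference is not possible (assuming a Pipeline p):
  PCollection<String> names = p.apply(
      Create.of("alice", "bob").withCoder(StringUtf8Coder.of()));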
- withCollection(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
-
Sets the collection to consider in the database.
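Usage sketch for the MongoDbIO.Read entry above, assuming a Pipeline p; the URI, database, and collection names are placeholders:
  PCollection<Document> docs = p.apply(MongoDbIO.read()
      .withUri("mongodb://localhost:27017")
      .withDatabase("mydb")
      .withCollection("orders"));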
- withCollection(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
-
Sets the collection to write data to in the database.
- withColumns(String...) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withColumns(String...) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withColumns(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withColumns(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withCommitDeadline(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the commit deadline.
- withCommitDeadline(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the commit deadline.
- withCommitDeadline(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the deadline for the Commit API call.
- withCommitRetrySettings(RetrySettings) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the commit retry settings.
- withCompression(double) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.GlobalDigest
-
Sets the compression factor cf.
- withCompression(double) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.PerKeyDigest
-
Sets the compression factor cf.
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.CompressedSource
-
Like CompressedSource.withDecompression(org.apache.beam.sdk.io.CompressedSource.DecompressingChannelFactory) but takes a canonical Compression.
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
-
Reads from input sources using the specified compression type.
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
-
Specifies the Compression of all generated shard files.
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.FileIO.ReadMatches
-
Reads files using the given Compression.
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Specifies to compress all generated shard files using the given Compression and, by default, append the respective extension to the filename.
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
-
Specifies the Compression of all generated shard files.
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.TextIO.Read
-
Reads from input sources using the specified compression type.
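Usage sketch for the TextIO.Read#withCompression entry above, assuming a Pipeline p; the file pattern is a placeholder:
  PCollection<String> lines = p.apply(
      TextIO.read().from("gs://my-bucket/logs/*.gz").withCompression(Compression.GZIP));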
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
-
Deprecated. Reads from input sources using the specified compression type.
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Returns a transform for writing to text files like this one but that compresses output using the given Compression.
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
-
Returns a transform for reading TFRecord files that decompresses all input files using the specified compression type.
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
-
Writes to output files using the specified compression type.
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
Decompresses all input files using the specified compression type.
- withCompressionCodec(CompressionCodecName) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
-
Specifies compression codec.
- withCompressionEnabled(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
Configure whether the REST client should compress requests using gzip content encoding and add the "Accept-Encoding: gzip" header.
- withCompressionType(TextIO.CompressionType) - Method in class org.apache.beam.sdk.io.TextIO.Read
-
Deprecated.
- withCompressionType(TextIO.CompressionType) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
-
Deprecated.
- withCompressionType(TFRecordIO.CompressionType) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
-
Deprecated.
- withCompressionType(TFRecordIO.CompressionType) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
-
Deprecated.
- withCompressionType(XmlIO.Read.CompressionType) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
Deprecated.
- withConcurrentRequests(int) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
-
Max number of concurrent batch write requests per bundle.
- withConcurrentRequests(int) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
-
Max number of concurrent batch write requests per bundle; the default is 5.
- withConfidence(double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.GlobalSketch
-
Sets the confidence value, i.e.
- withConfidence(double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.PerKeySketch
-
Sets the confidence value, i.e.
- withConfig(PluginConfig) - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Sets a plugin config.
- withConfig(Map<String, Object>) - Method in class org.apache.beam.sdk.managed.Managed.ManagedTransform
-
Use the input Map of configuration arguments to build and instantiate the underlying transform.
- withConfig(Config) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- withConfigProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
-
Sets the configuration properties like metastore URI.
- withConfigProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
-
Sets the configuration properties like metastore URI.
- withConfiguration(Map<String, String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
-
Specify Hadoop configuration for ParquetReader.
- withConfiguration(Map<String, String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ParseFiles
-
Specify Hadoop configuration for ParquetReader.
- withConfiguration(Map<String, String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
-
Specify Hadoop configuration for ParquetReader.
- withConfiguration(Map<String, String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
-
Specify Hadoop configuration for ParquetReader.
- withConfiguration(Map<String, String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
-
Specify Hadoop configuration for ParquetWriter.
- withConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.FileIO.Match
-
Sets the FileIO.MatchConfiguration.
- withConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
- withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
Reads from the source using the options provided by the given configuration.
- withConfiguration(Configuration) - Method in interface org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write.WriteBuilder
-
Writes to the sink using the options provided by the given hadoop configuration.
- withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
-
Reads from the HBase instance indicated by the given configuration.
- withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
-
Writes to the HBase instance indicated by the given Configuration.
- withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
-
Writes to the HBase instance indicated by the given Configuration.
- withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
-
Specify Hadoop configuration for ParquetReader.
- withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ParseFiles
-
Specify Hadoop configuration for ParquetReader.
- withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
-
Specify Hadoop configuration for ParquetReader.
- withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
-
Specify Hadoop configuration for ParquetReader.
- withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
-
Specify Hadoop configuration for ParquetWriter.
- withConfigurationTransform(PTransform<PCollection<? extends KV<KeyT, ValueT>>, PCollectionView<Configuration>>) - Method in interface org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write.WriteBuilder
-
Writes to the sink using the configuration created by the provided configurationTransformation.
- withConfigUrl(String) - Method in class org.apache.beam.sdk.managed.Managed.ManagedTransform
-
Like Managed.ManagedTransform.withConfig(Map), but instead extracts the configuration arguments from a specified YAML file location.
- withConnectionConfiguration(ElasticsearchIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
-
Provide the Elasticsearch connection configuration object.
- withConnectionConfiguration(ElasticsearchIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
-
Provide the Elasticsearch connection configuration object.
- withConnectionConfiguration(ElasticsearchIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
-
Provide the Elasticsearch connection configuration object.
- withConnectionConfiguration(ElasticsearchIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withConnectionConfiguration(MqttIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
-
Define the MQTT connection configuration used to connect to the MQTT broker.
- withConnectionConfiguration(MqttIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
-
Define MQTT connection configuration used to connect to the MQTT broker.
- withConnectionConfiguration(RedisConnectionConfiguration) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
- withConnectionConfiguration(RedisConnectionConfiguration) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
- withConnectionConfiguration(RedisConnectionConfiguration) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
- withConnectionConfiguration(RedisConnectionConfiguration) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
-
Predefine a RedisConnectionConfiguration and pass it to the builder.
- withConnectionConfiguration(SolrIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
-
Provide the Solr connection configuration object.
- withConnectionConfiguration(SolrIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
-
Provide the Solr connection configuration object.
- withConnectionFactory(ConnectionFactory) - Method in interface org.apache.beam.sdk.io.jms.JmsIO.ConnectionFactoryContainer
- withConnectionFactory(ConnectionFactory) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
-
Specify the JMS connection factory to connect to the JMS broker.
- withConnectionFactory(ConnectionFactory) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
-
Specify the JMS connection factory to connect to the JMS broker.
- withConnectionFactoryProviderFn(SerializableFunction<Void, ? extends ConnectionFactory>) - Method in interface org.apache.beam.sdk.io.jms.JmsIO.ConnectionFactoryContainer
- withConnectionFactoryProviderFn(SerializableFunction<Void, ? extends ConnectionFactory>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
-
Specify a JMS connection factory provider function to connect to the JMS broker.
- withConnectionFactoryProviderFn(SerializableFunction<Void, ? extends ConnectionFactory>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
-
Specify a JMS connection factory provider function to connect to the JMS broker.
- withConnectionInitSqls(Collection<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
Sets the connection init SQL statements passed to driver.connect(...).
- withConnectionInitSqls(ValueProvider<Collection<String>>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
Same as JdbcIO.DataSourceConfiguration.withConnectionInitSqls(Collection) but accepting a ValueProvider.
- withConnectionProperties(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
Sets the connection properties passed to driver.connect(...).
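Usage sketch for the JdbcIO.DataSourceConfiguration#withConnectionProperties entry above; the driver class, URL, credentials, and property string are placeholders:
  JdbcIO.DataSourceConfiguration config = JdbcIO.DataSourceConfiguration
      .create("org.postgresql.Driver", "jdbc:postgresql://localhost:5432/mydb")
      .withUsername("user")
      .withPassword("secret")
      .withConnectionProperties("ssl=true;sslmode=require");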
- withConnectionProperties(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
-
Sets the connection properties passed to driver.connect(...).
- withConnectionProperties(Map<String, String>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Sets a custom property to be used within the connection to your database.
- withConnectionProperties(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
Same as JdbcIO.DataSourceConfiguration.withConnectionProperties(String) but accepting a ValueProvider.
- withConnectionProperties(ValueProvider<Map<String, String>>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Sets a custom property to be used within the connection to your database.
- withConnectionProperty(String, String) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Sets a custom property to be used within the connection to your database.
- withConnectorClass(Class<?>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Applies the connectorClass to be used to connect to your database.
- withConnectorClass(ValueProvider<Class<?>>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Sets the connectorClass to be used to connect to your database.
- withConnectorConfiguration(DebeziumIO.ConnectorConfiguration) - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
-
Applies the given configuration to the connector.
- withConnectTimeout(Integer) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the Cassandra client connect timeout in ms.
- withConnectTimeout(Integer) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Cassandra client socket option for connect timeout in ms.
- withConnectTimeout(Integer) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
If set, overwrites the default connect timeout (1000ms) in the RequestConfig of the Elastic RestClient.
- withConnectTimeout(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the Cassandra client connect timeout in ms.
- withConnectTimeout(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Cassandra client socket option for connect timeout in ms.
- withConsistencyLevel(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the consistency level for the request (e.g.
- withConsistencyLevel(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the consistency level for the request (e.g.
- withConsistencyLevel(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the consistency level for the request (e.g.
- withConsistencyLevel(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the consistency level for the request (e.g.
- withConsistencyLevel(InfluxDB.ConsistencyLevel) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
-
Sets the consistency level to use.
- withConstructorArgs(Object...) - Method in class org.apache.beam.sdk.io.sparkreceiver.ReceiverBuilder
-
Method for specifying constructor arguments for the corresponding ReceiverBuilder.sparkReceiverClass.
- withConsumerArn(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specify consumer ARN to enable Enhanced Fan-Out.
- withConsumerConfigOverrides(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Replaces the configuration for the main consumer.
- withConsumerConfigUpdates(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Update configuration for the backend main consumer.
- withConsumerConfigUpdates(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Updates configuration for the main consumer.
- withConsumerFactoryFn(SerializableFunction<Map<String, Object>, ? extends Consumer<?, ?>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over KafkaIO.WriteRecords.withConsumerFactoryFn(SerializableFunction), used to keep compatibility with the old API based on the KV element type.
- withConsumerFactoryFn(SerializableFunction<Map<String, Object>, ? extends Consumer<?, ?>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
When exactly-once semantics are enabled (see KafkaIO.WriteRecords.withEOS(int, String)), the sink needs to fetch previously stored state from the Kafka topic.
- withConsumerFactoryFn(SerializableFunction<Map<String, Object>, Consumer<byte[], byte[]>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
A factory to create a Kafka Consumer from the consumer configuration.
- withConsumerFactoryFn(SerializableFunction<Map<String, Object>, Consumer<byte[], byte[]>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
A factory to create a Kafka Consumer from the consumer configuration.
- withConsumerPollingTimeout(long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets the timeout, in seconds, for the Kafka consumer polling request in the ReadFromKafkaDoFn.
- withConsumerPollingTimeout(long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Sets the timeout, in seconds, for the Kafka consumer polling request in the ReadFromKafkaDoFn.
- withContainer(String) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.Read
-
Specify the Cosmos container to read from.
- withContentTypeHint(String) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
-
Sets a content type hint to make the file parser detection more efficient.
- withCPUCount(int) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
-
Sets the desired minimal CPU or vCPU count for the transform's execution environment.
- withCreateDisposition(BigQueryIO.Write.CreateDisposition) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Specifies whether the table should be created if it does not exist.
- withCreateDisposition(CreateDisposition) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
A disposition to be used during table preparation.
- withCreateOrUpdateMetadataTable(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Returns a new BigtableIO.ReadChangeStream that, if set to true, will create or update the metadata table before launching the pipeline.
- withCreateTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Use the creation time of the KafkaRecord as the output timestamp.
- withCreateTime(Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets the timestamp policy based on the KafkaTimestampType.CREATE_TIME timestamp of the records.
- withCreateTime(Duration) - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
-
CustomTimestampPolicyWithLimitedDelay using KafkaTimestampType.CREATE_TIME from the record for the timestamp.
- withCreatWatermarkEstimatorFn(SerializableFunction<Instant, WatermarkEstimator<Instant>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
A function to create a WatermarkEstimator.
- withCredentials(Credentials) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the credentials.
- withCredentials(ValueProvider<Credentials>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the credentials.
- withCsvMapper(SnowflakeIO.CsvMapper<T>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
User-defined function mapping CSV lines into user data.
- withCustomBeamRequirement(String) - Method in class org.apache.beam.sdk.extensions.python.PythonService
-
Override the Beam version to be installed in the service environment.
- withCustomGcsTempLocation(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Provides a custom location on GCS for storing temporary files to be loaded via BigQuery batch load jobs.
- withCustomRateLimitPolicy(RateLimitPolicyFactory) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specifies the RateLimitPolicyFactory for a custom rate limiter.
- withCustomRecordParsing(String, SerializableFunction<String, OutputT>) - Method in class org.apache.beam.sdk.io.csv.CsvIOParse
-
Configures custom cell parsing.
- withCustomWatermarkPolicy(WatermarkParameters) - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory
-
Returns a custom WatermarkPolicyFactory.
- withCustomWatermarkPolicy(WatermarkPolicyFactory) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specifies the WatermarkPolicyFactory as a custom watermarkPolicyFactory.
- withCypher(String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- withCypher(String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- withCypher(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- withCypher(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- withCypherLogging() - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- withCypherLogging() - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- withDatabase(String) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.Read
-
Specify the Cosmos database to read from.
- withDatabase(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
-
Sets the database name.
- withDatabase(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
-
Sets the database name.
- withDatabase(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
-
Reads from the specified database.
- withDatabase(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
-
Writes to the specified database.
- withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
- withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
- withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
-
Sets the database to use.
- withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
-
Sets the database to use.
- withDatabase(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
- withDatabase(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets database to use.
- withDatabase(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Returns a new DatastoreV1.DeleteEntity that deletes entities from the Cloud Datastore for the specified database.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
-
Returns a new DatastoreV1.DeleteEntityWithSummary that deletes entities from the Cloud Datastore for the specified database.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Returns a new DatastoreV1.DeleteKey that deletes entities from the Cloud Datastore for the specified database.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
-
Returns a new DatastoreV1.DeleteKeyWithSummary that deletes entities from the Cloud Datastore for the specified database.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns a new DatastoreV1.Read that reads from the Cloud Datastore for the specified database.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Returns a new DatastoreV1.Write that writes to the Cloud Datastore for the database id.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
-
Returns a new DatastoreV1.WriteWithSummary that writes to the Cloud Datastore for the database id.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Write
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner database ID.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner database.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner database.
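Usage sketch for the SpannerIO.Read entries above, assuming a Pipeline p; the project, instance, database, table, and column names are placeholders:
  PCollection<Struct> rows = p.apply(SpannerIO.read()
      .withProjectId("my-project")
      .withInstanceId("my-instance")
      .withDatabaseId("my-database")
      .withTable("Users")
      .withColumns("id", "name"));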
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner database.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the Cloud Spanner database.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner database.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Same as DatastoreV1.DeleteEntity.withDatabaseId(String) but with a ValueProvider.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
-
Same as DatastoreV1.DeleteEntityWithSummary.withDatabaseId(String) but with a ValueProvider.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Same as DatastoreV1.DeleteKey.withDatabaseId(String) but with a ValueProvider.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
-
Same as DatastoreV1.DeleteKeyWithSummary.withDatabaseId(String) but with a ValueProvider.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Same as DatastoreV1.Write.withDatabaseId(String) but with a ValueProvider.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
-
Same as DatastoreV1.WriteWithSummary.withDatabaseId(String) but with a ValueProvider.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner database ID.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner database.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner database.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner database.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the Cloud Spanner database.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner database.
- withDatabaseRole(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner database role.
- withDataBoostEnabled(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies whether the pipeline should run on an independent compute resource.
- withDatasetService(FakeDatasetService) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- withDataSourceConfiguration(InfluxDbIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
-
Reads from the InfluxDB instance indicated by the given configuration.
- withDataSourceConfiguration(InfluxDbIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
-
Writes to the InfluxDB instance indicated by the given configuration.
- withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
- withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
- withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
- withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
- withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
- withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
- withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
- withDataSourceConfiguration(SingleStoreIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
- withDataSourceConfiguration(SingleStoreIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
- withDataSourceConfiguration(SingleStoreIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
- withDataSourceConfiguration(SnowflakeIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
Sets information about the Snowflake server.
- withDataSourceConfiguration(SnowflakeIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
Sets information about the Snowflake server.
- withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
- withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
- withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
- withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
- withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
- withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
- withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
- withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
Sets a function that will provide the SnowflakeIO.DataSourceConfiguration at runtime.
- withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
Sets a function that will provide the SnowflakeIO.DataSourceConfiguration at runtime.
- withDatumReaderFactory(AvroSource.DatumReaderFactory<?>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
-
Sets a custom AvroSource.DatumReaderFactory for reading.
- withDatumReaderFactory(AvroSource.DatumReaderFactory<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
-
Sets a custom AvroSource.DatumReaderFactory for reading.
- withDatumReaderFactory(AvroSource.DatumReaderFactory<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
-
Sets a custom AvroSource.DatumReaderFactory for reading.
- withDatumWriterFactory(AvroSink.DatumWriterFactory<ElementT>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
-
Sets a custom AvroSink.DatumWriterFactory for writing.
- withDatumWriterFactory(AvroSink.DatumWriterFactory<OutputT>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Specifies an AvroSink.DatumWriterFactory to use for creating DatumWriter instances.
- withDatumWriterFactory(AvroSink.DatumWriterFactory<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- withDdlString(String) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
- withDeadLetterQueue() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary.Builder
- withDeadLetterTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Creates and returns a transform for writing read failures out to a dead-letter topic.
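Usage sketch for the PubsubIO.Read#withDeadLetterTopic entry above, assuming a Pipeline p; the subscription and topic paths are placeholders:
  PCollection<PubsubMessage> messages = p.apply(PubsubIO.readMessages()
      .fromSubscription("projects/my-project/subscriptions/my-sub")
      .withDeadLetterTopic("projects/my-project/topics/dead-letter"));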
- withDeadLetterTopic(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Like PubsubIO.Read.withDeadLetterTopic(String) but with a ValueProvider.
- withDebugMode(StreamingLogLevel) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
The option to log verbose info (or only errors) about loaded files while streaming.
- withDecompression(CompressedSource.DecompressingChannelFactory) - Method in class org.apache.beam.sdk.io.CompressedSource
-
Return a CompressedSource that is like this one but will decompress its underlying file with the given CompressedSource.DecompressingChannelFactory.
- withDeduplicateKeys(List<String>) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Write
- withDeduplicateRecords(boolean) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
-
Optional, default: false.
- WithDefault() - Constructor for class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
- withDefaultConfig(boolean) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- withDefaultConfig(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- withDefaultHeaders(Header[]) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
For authentication or custom requirements, provide a set of default headers for the client.
- withDefaultMissingValueInterpretation(AppendRowsRequest.MissingValueInterpretation) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Specify how missing values should be interpreted when there is a default value in the schema.
- withDefaultRateLimiter() - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
- withDefaultRateLimiter(Duration, Duration, Duration) - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
- withDefaultTableProvider(String, TableProvider) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
- withDefaultThreadPool() - Static method in class org.apache.beam.runners.dataflow.util.PackageUtil
- withDefaultValue(T) - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
-
Default value to return for windows with no value in them.
- withDelay(Supplier<Duration>) - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
- withDelimiter(byte[]) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
-
Set the custom delimiter to be used in place of the default ones ('\r', '\n' or '\r\n').
- withDelimiter(byte[]) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.ReadFiles
- withDelimiter(byte[]) - Method in class org.apache.beam.sdk.io.TextIO.Read
-
Set the custom delimiter to be used in place of the default ones ('\r', '\n' or '\r\n').
- withDelimiter(byte[]) - Method in class org.apache.beam.sdk.io.TextIO.ReadFiles
- withDelimiter(char[]) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Specifies the delimiter after each string written.
- withDelimiter(char[]) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withDeliveryMode(DeliveryMode) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
-
Set the delivery mode.
- withDescription(String) - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns a copy of the Field with the description set.
- withDesiredBundleSizeBytes(long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
-
Deprecated. Set a value for the bundle size for parallel reads.
- withDesiredBundleSizeBytes(long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
-
Set a value for the bundle size for parallel reads.
- withDesiredBundleSizeBytes(long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
-
Deprecated. Set a value for the bundle size for parallel reads.
- withDesiredBundleSizeBytes(long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
-
Set a value for the bundle size for parallel reads.
- withDestinationCoder(Coder<DestinationT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Specifies a Coder for the destination type, if it cannot be inferred from FileIO.Write.by(org.apache.beam.sdk.transforms.SerializableFunction<UserT, DestinationT>).
- withDeterministicRecordIdFn(SerializableFunction<T, String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- withDeveloperToken(String) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.Read
-
Creates and returns a new GoogleAdsV19.Read transform with the specified developer token.
- withDeveloperToken(String) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.ReadAll
-
Creates and returns a new GoogleAdsV19.ReadAll transform with the specified developer token.
- withDialectView(PCollectionView<Dialect>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- withDirectExecutor() - Method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
-
Returns a ManagedChannelFactory like this one, but that will construct the channel to use the direct executor.
- withDirectoryTreatment(FileIO.ReadMatches.DirectoryTreatment) - Method in class org.apache.beam.sdk.io.FileIO.ReadMatches
-
Controls how to handle directories in the input PCollection.
- withDirectWriteProtos(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- withDisableAutoCommit(boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
-
Whether to disable auto commit on read.
- withDisableAutoCommit(boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
-
Whether to disable auto commit on read.
- withDisableAutoCommit(boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
-
Whether to disable auto commit on read.
- withDisableAutoCommit(boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
-
Whether to disable auto commit on read.
- withDisableCertificateValidation(boolean) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
-
Disables SSL certificate validation.
- withDisableCertificateValidation(boolean) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
-
Disables SSL certificate validation.
- withDisableCertificateValidation(Boolean) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
-
Disables SSL certificate validation.
- withDisableCertificateValidation(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
-
Same as SplunkIO.Write.withDisableCertificateValidation(Boolean) but with a ValueProvider.
- withDocVersionFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
-
Provide a function to extract the doc version from the document.
- withDocVersionFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withDocVersionType(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
-
Provide a function to extract the doc version from the document.
- withDocVersionType(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withDriverClassLoader(ClassLoader) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
Sets the class loader instance to be used to load the JDBC driver.
- withDriverConfiguration(Neo4jIO.DriverConfiguration) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- withDriverConfiguration(Neo4jIO.DriverConfiguration) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- withDriverJars(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
Comma-separated paths for JDBC drivers.
- withDriverJars(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
Same as JdbcIO.DataSourceConfiguration.withDriverJars(String) but accepting a ValueProvider.
- withDuration(Duration) - Method in class org.apache.beam.sdk.transforms.Deduplicate.KeyedValues
-
Returns a
KeyedValues
PTransform
like this one but with the specified duration. - withDuration(Duration) - Method in class org.apache.beam.sdk.transforms.Deduplicate.Values
-
Returns a
Values
PTransform
like this one but with the specified duration. - withDuration(Duration) - Method in class org.apache.beam.sdk.transforms.Deduplicate.WithRepresentativeValues
-
Return a
WithRepresentativeValues
PTransform
that is like this one, but with the specified deduplication duration. - withDynamicDelayRateLimitPolicy(Supplier<Duration>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specifies a dynamic delay rate limit policy with the given function being called at each polling interval to get the next delay value.
- withDynamicRead(Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Configure the KafkaIO to use
WatchForKafkaTopicPartitions
to detect and emit any new available TopicPartition
for ReadFromKafkaDoFn
to consume during pipeline execution time. - withEarlyFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
- withEarlyFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
-
Creates a new
Trigger
like this one, except that it fires repeatedly whenever the given Trigger
fires before the watermark has passed the end of the window. - withElementTimestamp() - Static method in interface org.apache.beam.sdk.io.kafka.KafkaPublishTimestampFunction
-
Returns a
KafkaPublishTimestampFunction
that returns the element timestamp from the ProcessContext. - withEmptyGlobalWindowDestination(DestinationT) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
If
FileIO.Write.withIgnoreWindowing()
is specified, specifies a destination to be used in case the collection is empty, to generate the (only, empty) output file. - withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
-
Deprecated.
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
-
Configures whether or not a filepattern matching no files is allowed.
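A minimal sketch of configuring empty-match handling on an Avro read; a Pipeline p is assumed, and the bucket path and the Avro-generated MyRecord class are placeholders:

    import org.apache.beam.sdk.extensions.avro.io.AvroIO;
    import org.apache.beam.sdk.io.fs.EmptyMatchTreatment;

    // Allow the filepattern to match zero files instead of failing the pipeline.
    p.apply(AvroIO.read(MyRecord.class)            // MyRecord: assumed Avro-generated class
        .from("gs://my-bucket/records/*.avro")     // assumed path
        .withEmptyMatchTreatment(EmptyMatchTreatment.ALLOW));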
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
-
Deprecated.
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.FileIO.Match
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
Sets the
EmptyMatchTreatment
. - withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.TextIO.Read
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
-
Deprecated.
- withEmulator(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
- withEmulator(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will use an official Bigtable emulator. - withEmulator(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
that will use an official Bigtable emulator. - withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
- withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner host, when an emulator is used.
- withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner emulator host.
- withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner emulator host.
- withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner emulator host.
- withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner emulator host.
- withEnableBatchLogs(Boolean) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
-
Method to enable batch logs.
- withEnableBatchLogs(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
-
Same as
SplunkIO.Write.withEnableBatchLogs(Boolean)
but with a ValueProvider
. - withEnableGzipHttpCompression(Boolean) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
-
Method to specify if HTTP requests sent to Splunk should be GZIP encoded.
- withEnableGzipHttpCompression(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
-
Same as
SplunkIO.Write.withEnableGzipHttpCompression(Boolean)
but with a ValueProvider
. - withEndKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns new
ByteKeyRange
like this one, but with the specified end key. - withEndMessageId(MessageId) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
- withEndpoint(String, int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
- withEndpoint(String, int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
- withEndpoint(String, int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
- withEndpoint(String, int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
-
Set the hostname and port of the Redis server to connect to.
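The same endpoint setter appears on the other RedisIO transforms listed above; a minimal read sketch, assuming a Pipeline p (host, port, and key pattern are placeholders):

    import org.apache.beam.sdk.io.redis.RedisIO;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Read all key/value pairs whose keys match a pattern.
    PCollection<KV<String, String>> entries =
        p.apply(RedisIO.read()
            .withEndpoint("localhost", 6379)
            .withKeyPattern("user:*"));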
- withEndTimestamp(Long) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
- withEntity(Class<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the entity class (annotated POJO).
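A minimal read sketch, assuming a Pipeline p and a Scientist POJO annotated for the Cassandra object mapper; hosts, keyspace, and table are placeholders, and withPort/withTable are assumed from the wider CassandraIO API:

    import java.util.Arrays;
    import org.apache.beam.sdk.coders.SerializableCoder;
    import org.apache.beam.sdk.io.cassandra.CassandraIO;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<Scientist> rows =
        p.apply(CassandraIO.<Scientist>read()
            .withHosts(Arrays.asList("cassandra-1", "cassandra-2"))
            .withPort(9042)                    // assumed setter from the wider API
            .withKeyspace("beam")
            .withTable("scientist")            // assumed setter from the wider API
            .withEntity(Scientist.class)
            .withCoder(SerializableCoder.of(Scientist.class)));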
- withEntity(Class<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the entity class in the input
PCollection
. - withEntries - Variable in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
- withEntryMapper(SqsIO.WriteBatches.EntryMapperFn.Builder<T>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
-
Optional mapper to create a batch entry from the input
T
using a builder, otherwise inferred from the schema. - withEntryMapper(SqsIO.WriteBatches.EntryMapperFn<T>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
-
Optional mapper to create a batch entry from a unique entry id and the input
T
, otherwise inferred from the schema. - withEnvironmentId(String) - Method in class org.apache.beam.sdk.values.WindowingStrategy
- withEOS(int, String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over
KafkaIO.WriteRecords.withEOS(int, String)
, used to keep compatibility with the old API based on the KV element type. - withEOS(int, String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Provides exactly-once semantics while writing to Kafka, which enables applications with end-to-end exactly-once guarantees on top of exactly-once semantics within Beam pipelines.
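A sketch of an exactly-once write, assuming a PCollection<KV<String, String>> kvs; the broker address, topic, shard count, and sink group id are illustrative (note that withEOS is only supported by some runners):

    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.common.serialization.StringSerializer;

    kvs.apply(KafkaIO.<String, String>write()
        .withBootstrapServers("broker-1:9092")
        .withTopic("results")
        .withKeySerializer(StringSerializer.class)
        .withValueSerializer(StringSerializer.class)
        // 5 shards; the sink group id identifies this sink across job restarts.
        .withEOS(5, "results-sink"));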
- withEpsilon(double) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Returns an
ApproximateQuantilesCombineFn
that's like this one except that it uses the specified epsilon
value. - withErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- withErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
- withErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Configures the PubSub read with an alternate error handler.
- withErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Writes any serialization failures out to the Error Handler.
- withErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
-
An optional error handler for handling records that failed to publish to Solace.
- withErrorsTransformer(PTransform<PCollection<Row>, POutput>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
- withErrorsTransformer(PTransform<PCollection<Row>, ? extends POutput>) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
- withEvent(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
-
Assigns the event payload to be sent to the HEC endpoint.
- withEventStore(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- withEventStore(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- withEventStore(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
- withEverythingCounted() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Turns on all monitoring.
- withEverythingCountedExceptedCaching() - Method in class org.apache.beam.io.requestresponse.Monitoring
-
Turns on all monitoring except for cache-related metrics.
- withExceptionReporting(Schema) - Static method in class org.apache.beam.sdk.transforms.JsonToRow
-
Enable Exception Reporting support.
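A sketch of parsing JSON with error reporting, assuming a PCollection<String> jsonLines (the schema is illustrative):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.JsonToRow;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    Schema schema = Schema.builder().addStringField("name").addInt32Field("age").build();
    JsonToRow.ParseResult result = jsonLines.apply(JsonToRow.withExceptionReporting(schema));
    PCollection<Row> rows = result.getResults();              // successfully parsed rows
    PCollection<Row> failed = result.getFailedToParseLines(); // rows describing parse failures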
- withExchange(String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
-
Defines the existing exchange where the messages will be sent.
- withExchange(String, String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
-
In AMQP, messages are published to an exchange and routed to queues based on the exchange type and a queue binding.
- withExchange(String, String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
-
Defines the to-be-declared exchange where the messages will be sent.
- withExchange(String, String, String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
-
In AMQP, messages are published to an exchange and routed to queues based on the exchange type and a queue binding.
- withExecuteStreamingSqlRetrySettings(RetrySettings) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the ExecuteStreamingSql retry settings.
- withExecutorService(ExecutorService) - Static method in class org.apache.beam.runners.dataflow.util.PackageUtil
- withExistingPipelineOptions(BigtableIO.ExistingPipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Returns a new
BigtableIO.ReadChangeStream
that decides what to do if an existing pipeline exists with the same change stream name. - withExpansionService(String) - Method in class org.apache.beam.sdk.extensions.python.transforms.DataframeTransform
-
Sets an expansion service endpoint for DataframeTransform.
- withExpansionService(String) - Method in class org.apache.beam.sdk.extensions.python.transforms.PythonMap
- withExpansionService(String) - Method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
-
Sets an expansion service endpoint for RunInference.
- withExpireTime(Long) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
- withExtendedErrorInfo() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Enables extended error information by enabling
WriteResult.getFailedInsertsWithErr().
- withExtendedErrorInfo() - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
-
Adds the error message to the returned error Row.
- withExtendedErrorInfo(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
-
Specify whether to use extended error info or not.
- withExtensionsFrom(Class<?>...) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
- withExtensionsFrom(Iterable<Class<?>>) - Method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
-
Returns a
DynamicProtoCoder
like this one, but with the extensions from the given classes registered. - withExtensionsFrom(Iterable<Class<?>>) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
Returns a
ProtoCoder
like this one, but with the extensions from the given classes registered. - withExternalSorterType(ExternalSorter.Options.SorterType) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
-
Sets the external sorter type.
- withExternalSynchronization(ExternalSynchronization) - Method in interface org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write.ExternalSynchronizationBuilder
-
Specifies class which will provide external synchronization required for hadoop write operation.
- withExtractOutputTimestampFn(SerializableFunction<KafkaRecord<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
A function to calculate output timestamp for a given
KafkaRecord
. - withExtractOutputTimestampFn(SerializableFunction<Message<byte[]>, Instant>) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
- withExtraPackages(List<String>) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
-
Specifies that the given Python packages are required for this transform, which will cause them to be installed in both the construction-time and execution-time environments.
- withExtraPackages(List<String>) - Method in class org.apache.beam.sdk.extensions.python.PythonService
-
Specifies that the given Python packages should be installed for this service environment.
- withExtraPackages(List<String>) - Method in class org.apache.beam.sdk.extensions.python.transforms.PythonMap
-
Specifies any extra packages required by the Python function.
- withExtraPackages(List<String>) - Method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
-
Specifies any extra packages required by the RunInference model handler.
- withFailedInsertRetryPolicy(InsertRetryPolicy) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Specifies a policy for handling failed inserts.
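A sketch of retrying only transient errors on streaming inserts, assuming a PCollection<TableRow> tableRows and an existing destination table (the table name is a placeholder):

    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy;

    tableRows.apply(BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.my_table")     // assumed existing table
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
        .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
        .withFailedInsertRetryPolicy(InsertRetryPolicy.retryTransientErrors()));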
- withFailureMode(SpannerIO.FailureMode) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies failure mode.
- WithFailures - Class in org.apache.beam.sdk.transforms
-
A collection of utilities for writing transforms that can handle exceptions raised during processing of elements.
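A sketch of the typical entry point via MapElements.exceptionsVia, which yields a WithFailures.Result; assumes a PCollection<String> input of decimal strings:

    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.WithFailures;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    WithFailures.Result<PCollection<Integer>, String> result = input.apply(
        MapElements.into(TypeDescriptors.integers())
            .via((String s) -> Integer.parseInt(s))      // may throw NumberFormatException
            .exceptionsInto(TypeDescriptors.strings())
            .exceptionsVia(ee -> ee.exception().getMessage()));
    PCollection<Integer> parsed = result.output();    // elements that mapped cleanly
    PCollection<String> failures = result.failures(); // one failure value per thrown exception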
- WithFailures() - Constructor for class org.apache.beam.sdk.transforms.WithFailures
- WithFailures.ExceptionAsMapHandler<T> - Class in org.apache.beam.sdk.transforms
-
A simple handler that extracts information from an exception to a
Map<String, String>
and returns a KV
where the key is the input element that failed processing, and the value is the map of exception attributes. - WithFailures.ExceptionElement<T> - Class in org.apache.beam.sdk.transforms
-
The value type passed as input to exception handlers.
- WithFailures.Result<OutputT,
FailureElementT> - Class in org.apache.beam.sdk.transforms -
An intermediate output type for PTransforms that allows an output collection to live alongside a collection of elements that failed the transform.
- WithFailures.ThrowableHandler<T> - Class in org.apache.beam.sdk.transforms
- withFanout(int) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
- withFanout(int) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineGlobally
- withFanout(int) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns a
PTransform
identical to this, but that uses an intermediate node to combine parts of the data to reduce load on the final global combine step. - withFaultTolerent(boolean) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
-
When faultTolerant is set to true, instructs the read scan to resume on another tablet server if the current server fails.
- withFetchSize(int) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
-
Sets how many rows are fetched and loaded in memory per database call.
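A read sketch with a larger fetch size, assuming a Pipeline p; the driver, URL, and query are placeholders:

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.io.jdbc.JdbcIO;

    p.apply(JdbcIO.<String>read()
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(
            "org.postgresql.Driver", "jdbc:postgresql://localhost:5432/mydb"))
        .withQuery("SELECT name FROM users")
        .withFetchSize(10_000)                 // rows buffered per database round-trip
        .withRowMapper(rs -> rs.getString(1))
        .withCoder(StringUtf8Coder.of()));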
- withFetchSize(int) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
-
Sets how many rows are fetched and loaded in memory per database call.
- withFetchSize(int) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
-
Sets how many rows are fetched and loaded in memory per database call.
- withFetchSize(int) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
-
The number of rows to fetch from the database in the same
ResultSet
round-trip. - withFieldAccessDescriptors(Map<FieldAccessDescriptor, Object>) - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
-
Sets field values using the FieldAccessDescriptors.
- withFieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return a descriptor that accesses the specified fields.
- withFieldIds(Iterable<Integer>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return a descriptor that accesses the specified fields.
- withFieldIds(FieldAccessDescriptor, Integer...) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return a descriptor that accesses the specified field ids as nested subfields of the baseDescriptor.
- withFieldIds(FieldAccessDescriptor, Iterable<Integer>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return a descriptor that accesses the specified field ids as nested subfields of the baseDescriptor.
- withFieldNameAs(String, String) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Add a field with a new name.
- withFieldNameAs(String, String) - Method in class org.apache.beam.sdk.schemas.transforms.Select.Fields
-
Add a single field to the selection, along with the name the field should take in the selected schema.
- withFieldNameAs(String, String) - Method in class org.apache.beam.sdk.schemas.transforms.Select.Flattened
-
Allows renaming a specific nested field.
- withFieldNames(Iterable<String>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return a descriptor that accesses the specified fields.
- withFieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return a descriptor that accesses the specified fields.
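For example, selecting a top-level field and a nested subfield by name (the field names are illustrative):

    import org.apache.beam.sdk.schemas.FieldAccessDescriptor;

    // Describes access to the "userId" field and the nested "location.city" field.
    FieldAccessDescriptor descriptor =
        FieldAccessDescriptor.withFieldNames("userId", "location.city");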
- withFieldNames(FieldAccessDescriptor, Iterable<String>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return a descriptor that accesses the specified field names as nested subfields of the baseDescriptor.
- withFieldNames(FieldAccessDescriptor, String...) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return a descriptor that accesses the specified field names as nested subfields of the baseDescriptor.
- withFieldNamesAs(Map<String, String>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return a descriptor that accesses the specified fields, renaming those fields.
- withFieldReordering() - Method in enum class org.apache.beam.sdk.extensions.sql.meta.ProjectSupport
- withFields(JsonObject) - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
-
Assigns fields value to the event metadata.
- withFields(Iterable<FieldAccessDescriptor.FieldDescriptor>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Returns a
FieldAccessDescriptor
that accesses the specified fields. - withFields(FieldAccessDescriptor.FieldDescriptor...) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Returns a
FieldAccessDescriptor
that accesses the specified fields. - withFieldValue(Integer, Object) - Method in class org.apache.beam.sdk.values.Row.Builder
-
Set a field value using the field id.
- withFieldValue(Integer, Object) - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
-
Set a field value using the field id.
- withFieldValue(String, Object) - Method in class org.apache.beam.sdk.values.Row.Builder
-
Set a field value using the field name.
- withFieldValue(String, Object) - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
-
Set a field value using the field name.
- withFieldValue(FieldAccessDescriptor, Object) - Method in class org.apache.beam.sdk.values.Row.Builder
-
Set a field value using a FieldAccessDescriptor.
- withFieldValue(FieldAccessDescriptor, Object) - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
-
Set a field value using a FieldAccessDescriptor.
- withFieldValueGetters(Factory<List<FieldValueGetter<T, Object>>>, T) - Method in class org.apache.beam.sdk.values.Row.Builder
- withFieldValues(Map<String, Object>) - Method in class org.apache.beam.sdk.values.Row.Builder
-
Sets field values using the field names.
- withFieldValues(Map<String, Object>) - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
-
Sets field values using the field names.
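A sketch of building a Row by field name (the schema and values are illustrative):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    Schema schema = Schema.builder()
        .addStringField("name")
        .addInt64Field("count")
        .build();
    Row row = Row.withSchema(schema)
        .withFieldValue("name", "beam")
        .withFieldValue("count", 42L)
        .build();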
- withFileExceptionHandler(ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
-
Specifies if exceptions should be logged only for streaming pipelines.
- withFileExceptionHandler(ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
-
Specifies if exceptions should be logged only for streaming pipelines.
- withFilename(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
- withFileNameTemplate(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
A template name for files saved to GCP.
- withFilter(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- withFilter(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
-
Sets the filter details.
- withFilter(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
- withFilter(Filter) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
-
Filters the rows read from HBase using the given row filter.
- withFilters(Bson) - Method in class org.apache.beam.sdk.io.mongodb.FindQuery
-
Sets the filters to find.
- withFindKey(String) - Method in class org.apache.beam.sdk.io.mongodb.UpdateConfiguration
-
Sets the key field used to find the documents to update.
- withFixedDelay() - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
- withFixedDelay(Duration) - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
- withFixedDelayRateLimitPolicy() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specifies a fixed delay rate limit policy with the default delay of 1 second.
- withFixedDelayRateLimitPolicy(Duration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specifies a fixed delay rate limit policy with the given delay.
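A read sketch with a five-second fixed polling delay, assuming a Pipeline p; the stream name is a placeholder, and InitialPositionInStream is assumed to be the KCL class used by this IO:

    import org.apache.beam.sdk.io.aws2.kinesis.KinesisIO;
    import org.joda.time.Duration;
    import software.amazon.kinesis.common.InitialPositionInStream;

    p.apply(KinesisIO.read()
        .withStreamName("my-stream")
        .withInitialPositionInStream(InitialPositionInStream.LATEST)
        .withFixedDelayRateLimitPolicy(Duration.standardSeconds(5)));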
- withFlowControl(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
with flow control enabled if enableFlowControl is true. - withFlushRowLimit(Integer) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
Sets the limit on the number of rows that will be saved to a staged file and then loaded to Snowflake.
- withFlushTimeLimit(Duration) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
Sets how often staged files will be created and then ingested by Snowflake during streaming.
- withFooter(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Adds a footer string to each file.
- withFooter(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withFooter(String) - Method in class org.apache.beam.sdk.io.TextIO.Sink
- withFormat(DataFormat) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
See
DataFormat
. - withFormatFn(KuduIO.FormatFunction<T>) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Write
-
Writes using the given function to create the mutation operations from the input.
- withFormatFunction(SourceRecordMapper<T>) - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
-
Applies a
SourceRecordMapper
to the connector. - withFormatFunction(SerializableFunction<UserT, String>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Deprecated.
- withFormatFunction(SerializableFunction<UserT, OutputT>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Specifies a format function to convert
AvroIO.TypedWrite
to the output type. - withFormatFunction(SerializableFunction<T, TableRow>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Formats the user's type into a
TableRow
to be written to BigQuery. - withFormatRecordOnFailureFunction(SerializableFunction<T, TableRow>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If an insert failure occurs, this function is applied to the originally supplied T element.
- withFromDateTime(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
-
Read metric data from the fromDateTime.
- withGapDuration(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.Sessions
-
Creates a
Sessions
WindowFn
with the specified gap duration. - withGCPApplicationDefaultCredentials() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Creates and sets the Application Default Credentials for a Kafka consumer.
- withGCPApplicationDefaultCredentials() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Creates and sets the Application Default Credentials for a Kafka producer.
- withGetOffsetFn(SerializableFunction<V, Long>) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
-
A function to get the offset in order to start the
Receiver
from it. - withGoogleAdsClientFactory(GoogleAdsClientFactory) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.Read
-
Creates and returns a new
GoogleAdsV19.Read
transform with the specified client factory. - withGoogleAdsClientFactory(GoogleAdsClientFactory) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.ReadAll
-
Creates and returns a new
GoogleAdsV19.ReadAll
transform with the specified client factory. - withGroupingFactor(int) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the multiple of max mutation (in terms of both bytes per batch and cells per batch) that is used to select a set of mutations to sort by key for batching.
- withHadoopConfiguration(Class<K>, Class<V>) - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Sets a plugin Hadoop configuration.
- withHadoopConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.cdap.Plugin
-
Sets a plugin Hadoop configuration.
- withHasError(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
-
Used to set whether or not there was an error for a given document as indicated by the response from Elasticsearch.
- withHasMultilineCSVRecords(Boolean) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
-
When reading RFC4180 CSV files that have values that span multiple lines, set this to true.
- withHeader(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Adds a header string to each file.
- withHeader(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withHeader(String) - Method in class org.apache.beam.sdk.io.TextIO.Sink
- withHighPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withHighPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- withHighPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- withHint(String, ResourceHint) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
-
Declares a custom resource hint that has a specified URN.
- withHintMatchesManyFiles() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
- withHintMatchesManyFiles() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
-
Hints that the filepattern specified in
AvroIO.Read.from(String)
matches a very large number of files. - withHintMatchesManyFiles() - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
-
Hints that the filepattern specified in
ContextualTextIO.Read.from(String)
matches a very large number of files. - withHintMatchesManyFiles() - Method in class org.apache.beam.sdk.io.TextIO.Read
-
Hints that the filepattern specified in
TextIO.Read.from(String)
matches a very large number of files. - withHintMaxNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Provide a hint to the QoS system for the intended max number of workers for a pipeline.
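A sketch of building Firestore QoS options with a worker-count hint (the values are illustrative):

    import org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions;
    import org.joda.time.Duration;

    RpcQosOptions options = RpcQosOptions.newBuilder()
        .withHintMaxNumWorkers(100)                      // intended max workers for the pipeline
        .withInitialBackoff(Duration.standardSeconds(5)) // backoff before the first retry
        .build();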
- withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Returns a new
DatastoreV1.DeleteEntity
with a different worker count hint for ramp-up throttling. - withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
-
Returns a new
DatastoreV1.DeleteEntityWithSummary
with a different worker count hint for ramp-up throttling. - withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Returns a new
DatastoreV1.DeleteKey
with a different worker count hint for ramp-up throttling. - withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
-
Returns a new
DatastoreV1.DeleteKeyWithSummary
with a different worker count hint for ramp-up throttling. - withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Returns a new
DatastoreV1.Write
with a different worker count hint for ramp-up throttling. - withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
-
Returns a new
DatastoreV1.WriteWithSummary
with a different worker count hint for ramp-up throttling. - withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Same as
DatastoreV1.DeleteEntity.withHintNumWorkers(int)
but with a ValueProvider
. - withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
-
Same as
DatastoreV1.DeleteEntityWithSummary.withHintNumWorkers(int)
but with a ValueProvider
. - withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Same as
DatastoreV1.DeleteKey.withHintNumWorkers(int)
but with a ValueProvider
. - withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
-
Same as
DatastoreV1.DeleteKeyWithSummary.withHintNumWorkers(int)
but with a ValueProvider
. - withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Same as
DatastoreV1.Write.withHintNumWorkers(int)
but with a ValueProvider
. - withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
-
Same as
DatastoreV1.WriteWithSummary.withHintNumWorkers(int)
but with a ValueProvider
. - withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
- withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner host.
- withHost(String) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
Define the host name of the Redis server.
- withHost(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
-
Assigns host value to the event metadata.
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner host.
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner host.
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner host.
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner host.
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner host.
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
- withHostName(String) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Sets the host name to be used on the database.
- withHostName(ValueProvider<String>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Sets the host name to be used on the database.
- withHosts(List<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the hosts of the Apache Cassandra instances.
- withHosts(List<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the Cassandra instance hosts where to write data.
- withHosts(ValueProvider<List<String>>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the hosts of the Apache Cassandra instances.
- withHosts(ValueProvider<List<String>>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the hosts of the Apache Cassandra instances.
- withHotKeyFanout(int) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- withHotKeyFanout(int) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
Like
Combine.PerKey.withHotKeyFanout(SerializableFunction)
, but returning the given constant value for every key. - withHotKeyFanout(SerializableFunction<? super K, Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
If a single key has disproportionately many values, it may become a bottleneck, especially in streaming mode.
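For example, spreading each key's pre-aggregation across ten intermediate shards, assuming a PCollection<KV<String, Long>> counts:

    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Sum;

    // Hot keys are partially combined on 10 intermediate keys before the final merge.
    counts.apply(Combine.<String, Long, Long>perKey(Sum.ofLongs())
        .withHotKeyFanout(10));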
- withHotKeyFanout(SerializableFunction<Row, Integer>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
- withIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
When reading from Cloud Pub/Sub where unique record identifiers are provided as Pub/Sub message attributes, specifies the name of the attribute containing the unique identifier.
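A read sketch that deduplicates on a message attribute, assuming a Pipeline p; the subscription and attribute name are placeholders:

    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;

    p.apply(PubsubIO.readStrings()
        .fromSubscription("projects/my-project/subscriptions/my-sub")
        .withIdAttribute("uniqueMsgId")); // attribute carrying the unique record id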
- withIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Writes to Pub/Sub, adding each record's unique identifier to the published messages in an attribute with the specified name.
- withIdFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
-
Provide a function to extract the id from the document.
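A write sketch that derives each document id from a JSON field, assuming a PCollection<String> docs of JSON documents; the addresses, index, and field name are placeholders:

    import org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO;

    docs.apply(ElasticsearchIO.write()
        .withConnectionConfiguration(ElasticsearchIO.ConnectionConfiguration.create(
            new String[] {"http://localhost:9200"}, "my-index"))
        .withIdFn(node -> node.get("id").asText())); // node is the document as a Jackson JsonNode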
- withIdFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withIdGenerator(IdGenerator) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
- withIgnoreSSLCertificate(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
-
Enables ignoreSSLCertificate for the SSL connection (allows self-signed certificates).
- withIgnoreSSLCertificate(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
-
Enables ignoreSSLCertificate for the SSL connection (allows self-signed certificates).
- withIgnoreVersionConflicts(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
-
Whether or not to suppress version conflict errors in a Bulk API response.
- withIgnoreVersionConflicts(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withIgnoreWindowing() - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Deprecated. Avoid usage of this method: its effects are complex and it will be removed in future versions of Beam. Right now it exists for compatibility with
WriteFiles
. - withInclusiveEndAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the end time of the change stream.
- withInclusiveStartAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the time that the change stream should be read from.
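A sketch of reading one hour of changes from a change stream, assuming a Pipeline p; the project, instance, database, and stream name are placeholders:

    import com.google.cloud.Timestamp;
    import org.apache.beam.sdk.io.gcp.spanner.SpannerConfig;
    import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;

    Timestamp start = Timestamp.now();
    Timestamp end = Timestamp.ofTimeSecondsAndNanos(start.getSeconds() + 3600, 0);
    p.apply(SpannerIO.readChangeStream()
        .withSpannerConfig(SpannerConfig.create()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withDatabaseId("my-database"))
        .withChangeStreamName("my_change_stream") // the stream must already exist
        .withInclusiveStartAt(start)
        .withInclusiveEndAt(end));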
- withIndex(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withIndex(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withIndex(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
-
Assigns index value to the event metadata.
- withIndexes() - Method in class org.apache.beam.sdk.extensions.python.transforms.DataframeTransform
-
Sets include_indexes option for DataframeTransform.
- withIndexFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
-
Provide a function to extract the target index from the document allowing for dynamic document routing.
- withIndexFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withIndexOffset(long) - Method in class org.apache.beam.runners.dataflow.internal.IsmFormat.IsmShard
-
Returns a new IsmShard like this one with the specified index offset.
- withInitialBackoff(Duration) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
-
Set initial backoff duration.
- withInitialBackoff(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the initial backoff duration to be used before retrying a request for the first time.
- withInitialPositionInStream(InitialPositionInStream) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specify reading from some initial position in stream.
- withInitialSplitDuration(Duration) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.ListHL7v2Messages
- withInitialTimestampInStream(Instant) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specify reading beginning at given
Instant
. - withInputDoc(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
-
Sets the input document.
- withInputMetadata(Metadata) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
-
Sets the input metadata for
Parser.parse(java.io.InputStream, org.xml.sax.ContentHandler, org.apache.tika.metadata.Metadata, org.apache.tika.parser.ParseContext)
. - withInputTimestamp() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over
KafkaIO.WriteRecords.withInputTimestamp()
, used to keep compatibility with the old API based on the KV element type. - withInputTimestamp() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
The timestamp for each record being published is set to timestamp of the element in the pipeline.
- withInsertDeduplicate(Boolean) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
-
For INSERT queries in the replicated table, specifies that deduplication of inserted blocks should be performed.
- withInsertDistributedSync(Boolean) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
-
If this setting is enabled, an insert query into a distributed table waits until the data has been sent to all nodes in the cluster.
- withInsertQuorum(Long) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
-
For INSERT queries in the replicated table, waits for the write to reach the specified number of replicas and linearizes the addition of the data.
- withInsertRetryPolicy(InsertRetryPolicy) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
-
Specify a retry policy for failed inserts.
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will read from the Cloud Bigtable instance indicated by given parameter, requires BigtableIO.Read.withProjectId(org.apache.beam.sdk.options.ValueProvider&lt;java.lang.String&gt;)
to be called to determine the project. - withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Returns a new
BigtableIO.ReadChangeStream
that will stream from the Cloud Bigtable instance indicated by given parameter, requires BigtableIO.ReadChangeStream.withProjectId(java.lang.String)
to be called to determine the project. - withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
that will write into the Cloud Bigtable instance indicated by given parameter, requires BigtableIO.Write.withProjectId(org.apache.beam.sdk.options.ValueProvider&lt;java.lang.String&gt;)
to be called to determine the project. - withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner instance ID.
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner instance.
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner instance.
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner instance.
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the Cloud Spanner instance.
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner instance.
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will read from the Cloud Bigtable instance indicated by given parameter, requires BigtableIO.Read.withProjectId(org.apache.beam.sdk.options.ValueProvider&lt;java.lang.String&gt;)
to be called to determine the project. - withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
that will write into the Cloud Bigtable instance indicated by given parameter, requires BigtableIO.Write.withProjectId(org.apache.beam.sdk.options.ValueProvider&lt;java.lang.String&gt;)
to be called to determine the project. - withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner instance ID.
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner instance.
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner instance.
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner instance.
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the Cloud Spanner instance.
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner instance.
- withInterceptors(List<ClientInterceptor>) - Method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
-
Returns a
ManagedChannelFactory
like this one, but which will apply the provided ClientInterceptors
to any channel it creates. - withInterpolateFunction(SerializableFunction<FillGaps.InterpolateData<ValueT>, ValueT>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
-
This function can be used to modify elements before propagating to the next bucket.
- withInterval(Duration) - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
- withIsDeleteFn(ElasticsearchIO.Write.BooleanFieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
-
Provide a function to extract the target operation (either upsert or delete) from the document fields, allowing dynamic bulk operation decisions.
- withIsDeleteFn(ElasticsearchIO.Write.BooleanFieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withIsLocalChannelProvider(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies whether a local channel provider should be used.
- withIsReady(Supplier<Boolean>) - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
-
Returns a new
TestStreams.Builder
like this one with the specified CallStreamObserver.isReady()
callback. - withIsUpsert(boolean) - Method in class org.apache.beam.sdk.io.mongodb.UpdateConfiguration
- withJobService(BigQueryServices.JobService) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- withJsonClustering(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
The same as
BigQueryIO.Write.withClustering(Clustering)
, but takes a JSON-serialized Clustering object in a deferred ValueProvider
. - withJsonSchema(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Similar to
BigQueryIO.Write.withSchema(TableSchema)
but takes in a JSON-serialized TableSchema
. - withJsonSchema(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Same as
BigQueryIO.Write.withJsonSchema(String)
but using a deferred ValueProvider
. - withJsonTimePartitioning(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
The same as
BigQueryIO.Write.withTimePartitioning(com.google.api.services.bigquery.model.TimePartitioning)
, but takes a JSON-serialized object. - withKeyClass(Class<K>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
-
Sets a key class.
- withKeyClass(Class<K>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
-
Sets a key class.
- withKeyDeserializer(Class<? extends Deserializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets a Kafka
Deserializer
to interpret key bytes read from Kafka. - withKeyDeserializer(Class<? extends Deserializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Sets a Kafka
Deserializer
to interpret key bytes read from Kafka. - withKeyDeserializer(DeserializerProvider<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withKeyDeserializerAndCoder(Class<? extends Deserializer<K>>, Coder<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets a Kafka
Deserializer
for interpreting key bytes read from Kafka along with a Coder
for helping the Beam runner materialize key objects at runtime if necessary. - withKeyDeserializerAndCoder(Class<? extends Deserializer<K>>, Coder<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Sets a Kafka
Deserializer
for interpreting key bytes read from Kafka along with a Coder
for helping the Beam runner materialize key objects at runtime if necessary. - withKeyDeserializerProvider(DeserializerProvider<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- withKeyDeserializerProviderAndCoder(DeserializerProvider<K>, Coder<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- withKeyDeserializerProviderAndCoder(DeserializerProvider<K>, Coder<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withKeyField(String) - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.Impl
- withKeyField(String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- withKeyField(String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Set the name of the key field in the resulting schema.
- withKeyPairAuth(String, PrivateKey) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets key pair authentication.
- withKeyPairPathAuth(String, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets key pair authentication.
- withKeyPairPathAuth(String, String, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets key pair authentication.
- withKeyPairPathAuth(ValueProvider<String>, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets key pair authentication.
- withKeyPairPathAuth(ValueProvider<String>, String, ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets key pair authentication.
- withKeyPairRawAuth(String, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets key pair authentication.
- withKeyPairRawAuth(String, String, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets key pair authentication.
- withKeyPairRawAuth(ValueProvider<String>, ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets key pair authentication.
- withKeyPairRawAuth(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets key pair authentication.
- withKeyPattern(String) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
- withKeyRange(byte[], byte[]) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
-
Reads only rows in the specified range.
- withKeyRange(ByteKeyRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will read only rows in the specified range. - withKeyRange(ByteKeyRange) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
-
Reads only rows in the specified range.
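A read sketch restricted to a row-key range, assuming a Pipeline p; the configuration, table, and keys are placeholders:

    import java.nio.charset.StandardCharsets;
    import org.apache.beam.sdk.io.hbase.HBaseIO;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;

    Configuration conf = HBaseConfiguration.create(); // assumed to point at the cluster
    p.apply(HBaseIO.read()
        .withConfiguration(conf)
        .withTableId("events")
        .withKeyRange(
            "row-000".getBytes(StandardCharsets.UTF_8),   // start key, inclusive
            "row-999".getBytes(StandardCharsets.UTF_8))); // stop key, exclusive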
- withKeyRanges(List<ByteKeyRange>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will read only rows in the specified ranges. - withKeyRanges(ValueProvider<List<ByteKeyRange>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will read only rows in the specified ranges. - WithKeys<T> - Class in org.apache.beam.sdk.schemas.transforms
- WithKeys<K,
V> - Class in org.apache.beam.sdk.transforms -
WithKeys<K, V>
takes a PCollection<V>, and either a constant key of type K or a function from V to K, and returns a PCollection<KV<K, V>>, where each of the values in the input PCollection
has been paired with either the constant key or a key computed from the value. - withKeySerializer(Class<? extends Serializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over
KafkaIO.WriteRecords.withKeySerializer(Class)
, used to keep compatibility with the old API based on the KV element type. - withKeySerializer(Class<? extends Serializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Sets a
Serializer
for serializing key (if any) to bytes. - withKeySet(KeySet) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withKeySet(KeySet) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withKeyspace(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the Cassandra keyspace where to read data.
- withKeyspace(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the Cassandra keyspace where to write data.
- withKeyspace(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the Cassandra keyspace where to read data.
- withKeyspace(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the Cassandra keyspace where to write data.
- withKeystorePassword(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
If Elasticsearch uses SSL/TLS with mutual authentication (via shield), provide the password to open the client keystore.
- withKeystorePath(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
If Elasticsearch uses SSL/TLS with mutual authentication (via shield), provide the keystore containing the client key.
- withKeyTranslation(SimpleFunction<?, K>) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
Transforms the keys read from the source using the given key translation function.
- withKeyTranslation(SimpleFunction<?, K>, Coder<K>) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
Transforms the keys read from the source using the given key translation function.
- withKeyType(TypeDescriptor<K>) - Method in class org.apache.beam.sdk.transforms.WithKeys
-
Return a
WithKeys
that is like this one with the specified key type descriptor. - withKmsKey(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
For query sources, use this Cloud KMS key to encrypt any temporary tables created.
- withKmsKey(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- withKwarg(String, Object) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
-
Specifies a single keyword argument for the Python cross-language transform.
- withKwarg(String, Object) - Method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
-
Sets keyword arguments for the model loader.
- withKwargs(Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
-
Specifies keyword arguments for the Python cross-language transform.
- withKwargs(Row) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
-
Specifies keyword arguments as a Row object.
- withLabel(String) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
Set the item
label
. - withLateFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
- withLateFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
-
Creates a new
Trigger
like this one, except that it fires repeatedly whenever the given Trigger
fires after the watermark has passed the end of the window. - withLimit(int) - Method in class org.apache.beam.sdk.io.mongodb.FindQuery
-
Sets the limit of documents to find.
- withLinkUrl(String) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
Set the item
link url
. - withLiteralGqlQuery(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns a new
DatastoreV1.Read
that reads the results of the specified GQL query. - withLiteralGqlQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Same as
DatastoreV1.Read.withLiteralGqlQuery(String)
but with aValueProvider
. - withLoadJobProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Set the project the BigQuery load job will be initiated from.
- withLoadJobProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- withLocalDc(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the local DC used for the load balancing.
- withLocalDc(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the local DC used by the load balancing policy.
- withLocalDc(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the local DC used for the load balancing.
- withLocalDc(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the local DC used for the load balancing.
- withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Returns a new
DatastoreV1.DeleteEntity
that deletes entities from the Cloud Datastore Emulator running locally on the specified host port. - withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
-
Returns a new
DatastoreV1.DeleteEntityWithSummary
that deletes entities from the Cloud Datastore Emulator running locally on the specified host port. - withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Returns a new
DatastoreV1.DeleteKey
that deletes entities from the Cloud Datastore Emulator running locally on the specified host port. - withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
-
Returns a new
DatastoreV1.DeleteKeyWithSummary
that deletes entities from the Cloud Datastore Emulator running locally on the specified host port. - withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns a new
DatastoreV1.Read
that reads from a Datastore Emulator running at the given localhost address. - withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Returns a new
DatastoreV1.Write
that writes to the Cloud Datastore Emulator running locally on the specified host port. - withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
-
Returns a new
DatastoreV1.WriteWithSummary
that writes to the Cloud Datastore Emulator running locally on the specified host port. - withLocksDirPath(String) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
-
Sets the path to the directory where locks will be stored.
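Tying the DatastoreV1 withLocalhost entries above together: a minimal sketch of reading from and writing to a local Datastore emulator, assuming a Pipeline p, a com.google.datastore.v1.Query named query, and placeholder project and host values:

    import com.google.datastore.v1.Entity;
    import org.apache.beam.sdk.io.gcp.datastore.DatastoreIO;
    import org.apache.beam.sdk.values.PCollection;

    // Both the read and the write are pointed at the emulator's host:port.
    PCollection<Entity> entities =
        p.apply(
            DatastoreIO.v1().read()
                .withProjectId("test-project")
                .withQuery(query)
                .withLocalhost("localhost:8081"));
    entities.apply(
        DatastoreIO.v1().write()
            .withProjectId("test-project")
            .withLocalhost("localhost:8081"));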
- withLogAppendTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withLogAppendTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Use the log append time as the output timestamp.
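A minimal sketch of the KafkaIO.Read variant, with placeholder broker and topic names:

    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.common.serialization.StringDeserializer;

    // Records are timestamped with the broker's log-append time rather than
    // with processing time.
    p.apply(
        KafkaIO.<String, String>read()
            .withBootstrapServers("broker-1:9092")
            .withTopic("events")
            .withKeyDeserializer(StringDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withLogAppendTime());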
- withLogAppendTime() - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
-
A
TimestampPolicy
that assigns Kafka's log append time (server side ingestion time) to each record. - withLoginCustomerId(Long) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.Read
-
Creates and returns a new
GoogleAdsV19.Read
transform with the specified login customer ID. - withLoginCustomerId(Long) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.ReadAll
-
Creates and returns a new
GoogleAdsV19.ReadAll
transform with the specified login customer ID. - withLoginTimeout(Integer) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets loginTimeout that will be used in
SnowflakeBasicDataSource.setLoginTimeout(int)
. - withLowerBound(PartitionColumnT) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
- withLowPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withLowPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- withLowPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- withManualWatermarkEstimator() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Use the
WatermarkEstimators.Manual
as the watermark estimator. - withMapper(ObjectMapper) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
-
Use custom Jackson
ObjectMapper
instead of the default one. - withMapper(ObjectMapper) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
-
Use custom Jackson
ObjectMapper
instead of the default one. - withMapperFactoryFn(SerializableFunction<Session, Mapper>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
A factory to create a specific
Mapper
for a given Cassandra Session. - withMapperFactoryFn(SerializableFunction<Session, Mapper>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
- withMasterAddresses(String) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
-
Reads from the Kudu cluster on the specified master addresses.
- withMasterAddresses(String) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Write
-
Writes to the Kudu cluster on the specified master addresses.
- withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
-
Sets the
FileIO.MatchConfiguration
. - withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
-
Deprecated.Sets the
FileIO.MatchConfiguration
. - withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
-
Sets the
FileIO.MatchConfiguration
. - withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
-
Deprecated.Sets the
FileIO.MatchConfiguration
. - withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
-
Sets the
FileIO.MatchConfiguration
. - withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.TextIO.Read
-
Sets the
FileIO.MatchConfiguration
. - withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
-
Deprecated.Sets the
FileIO.MatchConfiguration
. - withMaxActiveBundlesPerWorker(int) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
- withMaxAttempts(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the maximum number of times a request will be attempted for a complete successful result.
- withMaxBatchBufferingDuration(long) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
- withMaxBatchBufferingDuration(long) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
-
Provide maximum buffering time to batch elements before committing SQL statement.
- withMaxBatchBufferingDuration(long) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
- withMaxBatchBytesSize(int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Writes to Pub/Sub are limited to 10 MB in general.
- withMaxBatchSize(int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Writes to Pub/Sub are batched to efficiently send data.
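A sketch combining both batching knobs, assuming a PCollection<PubsubMessage> named messages and a placeholder topic:

    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;

    // Publish requests carry at most 100 messages or ~1 MB, whichever is
    // reached first.
    messages.apply(
        PubsubIO.writeMessages()
            .to("projects/my-project/topics/my-topic")
            .withMaxBatchSize(100)
            .withMaxBatchBytesSize(1_000_000));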
- withMaxBatchSize(int) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
-
Provide a maximum size in number of documents for the batch.
- withMaxBatchSize(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
-
Provide a maximum size in number of documents for the batch; see the bulk API (https://www.elastic.co/guide/en/elasticsearch/reference/7.17/docs-bulk.html).
- withMaxBatchSize(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withMaxBatchSizeBytes(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
-
Provide a maximum size in bytes for the batch; see the bulk API (https://www.elastic.co/guide/en/elasticsearch/reference/7.17/docs-bulk.html).
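For illustration, both bulk limits on a write, assuming a PCollection<String> of JSON documents named docs and an existing connectionConfiguration:

    import org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO;

    // A bulk request is flushed after 1000 documents or ~5 MB, whichever
    // comes first.
    docs.apply(
        ElasticsearchIO.write()
            .withConnectionConfiguration(connectionConfiguration)
            .withMaxBatchSize(1000L)
            .withMaxBatchSizeBytes(5L * 1024L * 1024L));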
- withMaxBatchSizeBytes(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withMaxBufferElementCount(Integer) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will break up read requests into smaller batches. - withMaxBufferingDuration(Duration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
-
If using
ElasticsearchIO.BulkIO.withUseStatefulBatches(boolean)
, this can be used to set a maximum elapsed time before buffered elements are emitted to Elasticsearch as a Bulk API request. - withMaxBufferingDuration(Duration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withMaxBufferingDuration(Duration) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
-
Sets a time limit (in processing time) on how long an incomplete batch of elements is allowed to be buffered.
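A minimal sketch, assuming a PCollection<KV<String, String>> named keyedInput:

    import org.apache.beam.sdk.transforms.GroupIntoBatches;
    import org.joda.time.Duration;

    // A batch is emitted once it holds 100 elements, or once its oldest
    // element has been buffered for 10 seconds of processing time.
    keyedInput.apply(
        GroupIntoBatches.<String, String>ofSize(100)
            .withMaxBufferingDuration(Duration.standardSeconds(10)));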
- withMaxBytesPerBatch(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
with the max bytes a batch can have. - withMaxBytesPerPartition(long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Control how much data will be assigned to a single BigQuery load job.
- withMaxCapacityPerShard(Integer) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specifies the maximum number of messages per one shard.
- withMaxCommitDelay(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- withMaxCommitDelay(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies max commit delay for the Commit API call for throughput optimized writes.
- withMaxCommitDelay(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the max commit delay for high throughput writes.
- withMaxCommitDelay(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the max commit delay for high throughput writes.
- withMaxConnectionIdleTime(int) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
-
Sets the maximum idle time for a pooled connection.
- withMaxConnectionIdleTime(int) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
-
Sets the maximum idle time for a pooled connection.
- withMaxConnections(Integer) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
Sets the maximum total number of connections.
- withMaxConnections(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
Same as
JdbcIO.DataSourceConfiguration.withMaxConnections(Integer)
but accepting a ValueProvider. - withMaxCumulativeBackoff(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the maximum cumulative backoff.
- withMaxCumulativeBackoff(Duration) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
-
Limits total time spent in backoff.
- withMaxCumulativeBackoff(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the maximum cumulative backoff.
- withMaxCumulativeBackoff(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the maximum cumulative backoff time when retrying after DEADLINE_EXCEEDED errors.
- withMaxElementsPerBatch(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
with the max elements a batch can have. - withMaxFileSize(long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Controls the maximum byte size per file to be loaded into BigQuery.
- withMaxFilesPerBundle(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Control how many files will be written concurrently by a single worker when using BigQuery load jobs before spilling to a shuffle.
- withMaxFilesPerPartition(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Controls how many files will be assigned to a single BigQuery load job.
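A sketch of the two load-job partitioning knobs together (schema and disposition settings are omitted; the table name is a placeholder):

    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;

    // Each load job receives at most ~10 GB spread over at most 1000 files;
    // larger inputs are split across multiple load jobs.
    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")
            .withMaxBytesPerPartition(10L * 1024 * 1024 * 1024)
            .withMaxFilesPerPartition(1000));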
- withMaxGapFillBuckets(Long) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
- withMaxInputSize(long) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Returns an
ApproximateQuantilesCombineFn
that's like this one except that it uses the specifiedmaxNumElements
value. - withMaxInsertBlockSize(long) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
-
The maximum block size for insertion, if we control the creation of blocks for insertion.
- withMaxLen(long) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
-
When appending (XADD) to a stream, set a MAXLEN option.
- withMaxNumberOfRecords(Integer) - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
-
Once the specified number of records has been reached, it will stop fetching them.
- withMaxNumConnections(Integer) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
-
Optional.
- withMaxNumMutations(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the cell mutation limit (maximum number of mutated cells per batch).
- withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
-
Define the max number of records received by the
AmqpIO.Read
. - withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specifies to read at most a given number of records.
- withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Read
-
Define the max number of records received by the
SqsIO.Read
. - withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
-
Returns a new
BoundedReadFromUnboundedSource
that reads a bounded amount of data from the givenUnboundedSource
. - withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
-
Define the max number of records that the source will read.
- withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Similar to
Read.Unbounded.withMaxNumRecords(long)
. - withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
-
Define the max number of records received by the
MqttIO.Read
. - withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
-
Define the max number of records received by the
RabbitMqIO.Read
. - withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.Read.Unbounded
-
Returns a new
BoundedReadFromUnboundedSource
that reads a bounded amount of data from the givenUnboundedSource
. - withMaxNumRows(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the row mutation limit (maximum number of mutated rows per batch).
- withMaxNumWritersPerBundle(int) - Method in class org.apache.beam.sdk.io.WriteFiles
-
Set the maximum number of writers created in a bundle before spilling to shuffle.
- withMaxNumWritersPerBundle(Integer) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
See WriteFiles.withMaxNumWritersPerBundle(int).
- withMaxNumWritersPerBundle(Integer) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
-
Set the maximum number of writers created in a bundle before spilling to shuffle.
- withMaxNumWritersPerBundle(Integer) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Set the maximum number of writers created in a bundle before spilling to shuffle.
- withMaxNumWritersPerBundle(Integer) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
-
Set the maximum number of writers created in a bundle before spilling to shuffle.
- withMaxNumWritersPerBundle(Integer) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Set the maximum number of writers created in a bundle before spilling to shuffle.
- withMaxNumWritersPerBundle(Integer) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withMaxNumWritersPerBundle(Integer) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
-
See WriteFiles.withMaxNumWritersPerBundle(int).
- withMaxOutstandingBytes(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
with the max number of outstanding bytes allowed before enforcing flow control. - withMaxOutstandingElements(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
with the max number of outstanding elements allowed before enforcing flow control. - withMaxParallelRequests(int) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
-
When using
ElasticsearchIO.BulkIO.withUseStatefulBatches(boolean)
Stateful Processing, states and therefore batches are maintained per-key-per-window. - withMaxParallelRequests(int) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withMaxParallelRequestsPerWindow(int) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
-
Deprecated.use
ElasticsearchIO.BulkIO.withMaxParallelRequests(int)
instead. - withMaxParallelRequestsPerWindow(int) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
-
Deprecated.
- withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.Read.Unbounded
-
Returns a new
BoundedReadFromUnboundedSource
that reads a bounded amount of data from the givenUnboundedSource
. - withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
-
Define the max read time (duration) during which the
AmqpIO.Read
will receive messages. - withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specifies to read records during
maxReadTime
. - withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Read
-
Define the max read time (duration) during which the
SqsIO.Read
will receive messages. - withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
-
Returns a new
BoundedReadFromUnboundedSource
that reads a bounded amount of data from the givenUnboundedSource
. - withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.GenerateSequence
-
Specifies to stop generating elements after the given time.
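A minimal sketch, assuming a Pipeline p:

    import org.apache.beam.sdk.io.GenerateSequence;
    import org.joda.time.Duration;

    // An unbounded sequence of longs at 5 elements per second, truncated
    // after one minute, yielding a bounded PCollection<Long>.
    p.apply(
        GenerateSequence.from(0)
            .withRate(5, Duration.standardSeconds(1))
            .withMaxReadTime(Duration.standardMinutes(1)));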
- withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
-
Define the max read time that the source will read.
- withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Similar to
Read.Unbounded.withMaxReadTime(Duration)
. - withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
-
Define the max read time (duration) during which the
MqttIO.Read
will receive messages. - withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
-
Define the max read time (duration) during which the
RabbitMqIO.Read
will receive messages. - withMaxRetries(int) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
-
Maximum number of retries per insert.
- withMaxRetryJobs(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If set, this will set the max number of retry of batch load jobs.
- withMaxTimeToRun(Long) - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
-
Once the connector has run for the specified amount of time, it will stop.
- withMemoryMB(int) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
-
Sets the size of the memory buffer in megabytes.
- withMergeFunction(SerializableBiFunction<TimestampedValue<ValueT>, TimestampedValue<ValueT>, TimestampedValue<ValueT>>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
-
If there are multiple values in a single timeseries bucket, this function is used to specify what to propagate to the next bucket.
- withMessageMapper(JmsIO.MessageMapper<T>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
- withMetadata() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
-
Include metadata in result json documents.
- withMetadata(String, byte[]) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Deprecated.use schema options instead.
- withMetadata(String, String) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Deprecated.use schema options instead.
- withMetadata(Map<String, byte[]>) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Deprecated.use schema options instead.
- withMetadata(Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
-
Specifies to put the given metadata into each generated file.
- withMetadata(Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Writes to Avro file(s) with the specified metadata.
- withMetadata(Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- withMetadataDatabase(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the metadata database.
- withMetadataInstance(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the metadata instance.
- withMetadataTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the metadata table name.
- withMetadataTableAppProfileId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Returns a new
BigtableIO.ReadChangeStream
that will use the cluster specified by app profile id to store the metadata of the stream. - withMetadataTableInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Returns a new
BigtableIO.ReadChangeStream
that will use the Cloud Bigtable instance indicated by given parameter to manage the metadata of the stream. - withMetadataTableProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Returns a new
BigtableIO.ReadChangeStream
that will use the Cloud Bigtable project indicated by given parameter to manage the metadata of the stream. - withMetadataTableTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Returns a new
BigtableIO.ReadChangeStream
that will use specified table to store the metadata of the stream. - withMethod(BigQueryIO.TypedRead.Method) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withMethod(BigQueryIO.Write.Method) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Choose the method used to write data to BigQuery.
- withMethod(RedisIO.Write.Method) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
- withMetric(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
-
Sets the metric to use.
- WithMetricsSupport - Class in org.apache.beam.runners.spark.metrics
-
A MetricRegistry decorator-like that supports AggregatorMetric and SparkBeamMetric as Gauges
. - WithMetricsSupport - Class in org.apache.beam.runners.spark.structuredstreaming.metrics
-
A
MetricRegistry
decorator-like that supportsBeamMetricSet
s asGauges
. - withMinBundleSize(long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
-
Sets the minimum bundle size.
- withMinBundleSize(long) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
Sets a parameter
minBundleSize
for the minimum bundle size of the source. - withMinNumberOfSplits(Integer) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
It's possible that system.size_estimates isn't populated or that the number of splits computed by Beam is still too low for Cassandra to handle it.
- withMinNumberOfSplits(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
It's possible that system.size_estimates isn't populated or that the number of splits computed by Beam is still too low for Cassandra to handle it.
- withMinRam(long) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
-
Sets the desired minimum amount of RAM to have available in the transform's execution environment.
- withMinRam(String) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
-
Sets the desired minimum amount of RAM to have available in the transform's execution environment.
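A sketch of attaching the hint to a transform, assuming a PCollection<String> named input and a hypothetical expensive function; whether the hint is honored depends on the runner:

    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.resourcehints.ResourceHints;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Ask the runner to place this step on workers with at least 8 GiB of RAM.
    input.apply(
        "HeavyStep",
        MapElements.into(TypeDescriptors.strings())
            .via((String s) -> expensiveTransform(s)) // expensiveTransform is hypothetical
            .setResourceHints(ResourceHints.create().withMinRam("8GiB")));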
- withMode(WindowingStrategy.AccumulationMode) - Method in class org.apache.beam.sdk.values.WindowingStrategy
- withMongoDbPipeline(List<BsonDocument>) - Method in class org.apache.beam.sdk.io.mongodb.AggregationQuery
- withMonitoringConfiguration(Monitoring) - Method in class org.apache.beam.io.requestresponse.RequestResponseIO
- withMonotonicallyIncreasingWatermarkEstimator() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Use the
WatermarkEstimators.MonotonicallyIncreasing
as the watermark estimator. - withMultipleInputs(String...) - Method in class org.apache.beam.sdk.extensions.yaml.YamlTransform
-
Indicates that this YamlTransform expects multiple, named inputs.
- withMultipleOutputs(String...) - Method in class org.apache.beam.sdk.extensions.yaml.YamlTransform
-
Indicates that this YamlTransform expects multiple, named outputs.
- withName(String) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
- withName(String) - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns a copy of the Field with the name set.
- withNamedParameters(Map<String, ?>) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
- withNameOnlyQuery() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
-
Update produced queries to only retrieve their
__name__
thereby not retrieving any fields and reducing resource requirements. - withNamespace(Class<?>) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
- withNamespace(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns a new
DatastoreV1.Read
that reads from the given namespace. - withNamespace(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Same as
DatastoreV1.Read.withNamespace(String)
but with aValueProvider
. - withNaming(FileIO.Write.FileNaming) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Specifies a custom strategy for generating filenames.
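A minimal sketch using the built-in default naming, assuming a PCollection<String> named lines and a placeholder output path:

    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.io.TextIO;

    // Files are named with the "events" prefix and ".txt" suffix; shard and
    // window information is still encoded by the default naming.
    lines.apply(
        FileIO.<String>write()
            .via(TextIO.sink())
            .to("gs://my-bucket/output")
            .withNaming(FileIO.Write.defaultNaming("events", ".txt")));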
- withNaming(Contextful<Contextful.Fn<DestinationT, FileIO.Write.FileNaming>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Like
FileIO.Write.withNaming(SerializableFunction)
but allows accessing context, such as side inputs, from the function. - withNaming(SerializableFunction<DestinationT, FileIO.Write.FileNaming>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Specifies a custom strategy for generating filenames depending on the destination, similar to
FileIO.Write.withNaming(FileNaming)
. - withNestedField(int, FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return a descriptor that accesses the specified nested field.
- withNestedField(String, FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return a descriptor that accesses the specified nested field.
- withNestedField(FieldAccessDescriptor.FieldDescriptor, FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
- withNestedFieldAs(String, String, FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Like
FieldAccessDescriptor.withNestedField(int, org.apache.beam.sdk.schemas.FieldAccessDescriptor)
along with a rename of the nested field. - withNoOutputTimestamp() - Method in interface org.apache.beam.sdk.state.Timer
-
Asserts that there is no output timestamp.
- withNoSpilling() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
- withNoSpilling() - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
-
Whether to skip the spilling of data.
- withNoSpilling() - Method in class org.apache.beam.sdk.io.FileIO.Write
- withNoSpilling() - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
-
Whether to skip the spilling of data.
- withNoSpilling() - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
- withNoSpilling() - Method in class org.apache.beam.sdk.io.TextIO.Write
- withNoSpilling() - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
- withNoSpilling() - Method in class org.apache.beam.sdk.io.WriteFiles
-
Returns a new
WriteFiles
that writes all data without spilling, simplifying the pipeline. - withNullable(boolean) - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
- withNullable(boolean) - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns a copy of the Field with isNullable set.
- withNullable(boolean) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
- withNullBehavior(RowJson.RowJsonDeserializer.NullBehavior) - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
-
Sets the behavior of the deserializer according to
RowJson.RowJsonDeserializer.NullBehavior
. - withNumberOfClientsPerWorker(int) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
-
The number of clients that each worker will create.
- withNumberOfRecordsRead(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the number of records read in the partition change stream query before reading this record.
- withNumBuckets(Integer) - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeArbitrarily
- withNumBuckets(Integer) - Method in class org.apache.beam.sdk.transforms.Reshuffle.ViaRandomKey
- withNumFileShards(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Control how many file shards are written when using BigQuery load jobs.
- withNumPartitions(int) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
-
The number of partitions.
- withNumQuerySplits(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- withNumShards(int) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Configures the number of output shards produced overall (when using unwindowed writes) or per-window (when using windowed writes).
- withNumShards(int) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- withNumShards(int) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Specifies to use a given fixed number of shards per window.
- withNumShards(int) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
-
The number of workers used by the job to write to Solace.
- withNumShards(int) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Configures the number of output shards produced overall (when using unwindowed writes) or per-window (when using windowed writes).
- withNumShards(int) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withNumShards(int) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
-
Writes to the provided number of shards.
- withNumShards(int) - Method in class org.apache.beam.sdk.io.WriteFiles
-
Returns a new
WriteFiles
that will write to the currentFileBasedSink
using the specified number of shards. - withNumShards(Integer) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
-
Specifies to use a given fixed number of shards per window.
- withNumShards(Integer) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
-
Specifies to use a given fixed number of shards per window.
- withNumShards(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.FileIO.Write
- withNumShards(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
- withNumShards(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withNumShards(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.WriteFiles
-
Returns a new
WriteFiles
that will write to the currentFileBasedSink
using theValueProvider
specified number of shards. - withNumSplits(int) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
-
Sets the user defined number of splits.
- withNumStorageWriteApiStreams(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Control how many parallel streams are used when using Storage API writes.
- withOAuth(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets OAuth authentication.
- withOAuth(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets OAuth authentication.
- withOffset(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
-
Partitions the timestamp space into half-open intervals of the form [N * size + offset, (N + 1) * size + offset), where 0 is the epoch.
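A minimal sketch, assuming a PCollection named events:

    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.joda.time.Duration;

    // Hourly windows shifted by 15 minutes: [00:15, 01:15), [01:15, 02:15), ...
    events.apply(
        Window.into(
            FixedWindows.of(Duration.standardHours(1))
                .withOffset(Duration.standardMinutes(15))));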
- withOffset(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
Assigns timestamps into half-open intervals of the form [N * period + offset, N * period + offset + size).
- withOffsetConsumerConfigOverrides(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Set additional configuration for the offset consumer.
- withOffsetConsumerConfigOverrides(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Set additional configuration for the backend offset consumer.
- withOffsetDeduplication(Boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withOnCompleted(Runnable) - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
-
Returns a new
TestStreams.Builder
like this one with the specifiedStreamObserver.onCompleted()
callback. - withOnError(Runnable) - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
-
Returns a new
TestStreams.Builder
like this one with the specifiedStreamObserver.onError(java.lang.Throwable)
callback. - withOnError(Consumer<Throwable>) - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
-
Returns a new
TestStreams.Builder
like this one with the specifiedStreamObserver.onError(java.lang.Throwable)
consumer. - withOnNext(Consumer<T>) - Static method in class org.apache.beam.sdk.fn.test.TestStreams
-
Creates a test
CallStreamObserver
TestStreams.Builder
that forwardsStreamObserver.onNext(V)
calls to the suppliedConsumer
. - withOnTimeBehavior(Window.OnTimeBehavior) - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Override the default
Window.OnTimeBehavior
, to control whether to output an empty on-time pane. - withOnTimeBehavior(Window.OnTimeBehavior) - Method in class org.apache.beam.sdk.values.WindowingStrategy
- withOperationTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
with the operation timeout. - withOperationTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
with the operation timeout. - withOptionalParticipation() - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
-
Means that this field will participate in a join even when not present, similar to SQL outer-join semantics.
- withOptions(Schema.Options) - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns a copy of the Field with the options set.
- withOptions(Schema.Options) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns a copy of the Schema with the options set.
- withOptions(Schema.Options.Builder) - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns a copy of the Field with the options set.
- withOptions(Schema.Options.Builder) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns a copy of the Schema with the options set.
- withOrdered(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
-
Enables ordered bulk insertion (default: true).
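A sketch of disabling ordered insertion, assuming a PCollection<org.bson.Document> named documents and placeholder connection values:

    import org.apache.beam.sdk.io.mongodb.MongoDbIO;

    // Unordered bulk writes keep inserting the remaining documents after a
    // failure, at the cost of insertion-order guarantees.
    documents.apply(
        MongoDbIO.write()
            .withUri("mongodb://localhost:27017")
            .withDatabase("mydb")
            .withCollection("mycollection")
            .withOrdered(false));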
- withOrderingKey() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Writes to Pub/Sub with each record's ordering key.
- withOrderingKey(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- withOrdinality - Variable in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
- WITHOUT_FIELD_REORDERING - Enum constant in enum class org.apache.beam.sdk.extensions.sql.meta.ProjectSupport
- withoutDefaults() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns a
PTransform
identical to this, but that does not attempt to provide a default value in the case of empty input. - withoutLimiter() - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
- withoutMetadata() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Returns a
PTransform
for PCollection ofKV
, dropping Kafka metadata. - withoutPartitioning() - Method in interface org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write.PartitionedWriterBuilder
-
Writes to the sink without the need to partition output into a specified number of partitions.
- withOutputCoder(Coder<?>) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
-
Specifies the
Coder
of the outputPCollection
s produced by this transform. - withOutputCoder(Coder<OutputT>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Specifies a
Coder
to use for the outputs. - withOutputCoders(Map<String, Coder<?>>) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
-
Specifies the keys and
Coder
s of the outputPCollection
s produced by this transform. - withOutputFilenames() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
-
Specify that output filenames are wanted.
- withOutputFilenames() - Method in class org.apache.beam.sdk.io.TextIO.Write
-
Specify that output filenames are wanted.
- withOutputKeyCoder(Coder<KeyT>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Specifies the coder for the output key.
- withOutputParallelization(boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
-
Whether to reshuffle the resulting PCollection so results are distributed to all workers.
- withOutputParallelization(boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
-
Whether to reshuffle the resulting PCollection so results are distributed to all workers.
- withOutputParallelization(boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
-
Whether to reshuffle the resulting PCollection so results are distributed to all workers.
- withOutputParallelization(boolean) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
-
Whether to reshuffle the resulting PCollection so results are distributed to all workers.
- withOutputParallelization(boolean) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
-
Whether to reshuffle the resulting PCollection so results are distributed to all workers.
- withOutputParallelization(Boolean) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
-
Whether to reshuffle the resulting PCollection so results are distributed to all workers.
- withOutputs(List<TimestampedValue<OutputT>>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
-
Returns a new
Watch.Growth.PollResult
like this one with the provided outputs. - withOutputSchema(Schema) - Method in class org.apache.beam.sdk.schemas.transforms.Select.Fields
-
Rename all output fields to match the specified schema.
- withOutputSchema(Schema) - Method in class org.apache.beam.sdk.schemas.transforms.Select.Flattened
-
Rename all output fields to match the specified schema.
- withOutputTags(TupleTag<OutputT>, TupleTagList) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
-
Returns a new multi-output
ParDo
PTransform
that's like thisPTransform
but with the specified output tags. - withOutputTimestamp(Instant) - Method in interface org.apache.beam.sdk.state.Timer
-
Sets event time timer's output timestamp.
- withoutRepeater() - Method in class org.apache.beam.io.requestresponse.RequestResponseIO
-
Turns off repeat invocations (default is on) of
SetupTeardown
andCaller
, using theRepeater
, in the setting ofRequestResponseIO.REPEATABLE_ERROR_TYPES
. - withoutResultFlattening() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Disable flattening of query results.
- withoutResultFlattening() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withoutSharding() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Forces a single file as output and empty shard name template.
- withoutSharding() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- withoutSharding() - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
-
Forces a single file as output and empty shard name template.
- withoutSharding() - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
-
Forces a single file as output and empty shard name template.
- withoutSharding() - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Forces a single file as output and empty shard name template.
- withoutSharding() - Method in class org.apache.beam.sdk.io.TextIO.Write
- withoutSharding() - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
-
Forces a single file as output.
- withoutStrictParsing() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
-
During parsing of the arguments, we will skip over improperly formatted and unknown arguments.
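A minimal sketch, assuming a String[] args from main():

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Unknown or malformed arguments are skipped instead of failing fast.
    PipelineOptions options =
        PipelineOptionsFactory.fromArgs(args)
            .withoutStrictParsing()
            .create();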
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Disable validation that the table exists or the query succeeds prior to pipeline submission.
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Disables BigQuery table validation.
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Disables validation that the table being read from exists.
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Disables validation that the table being read and the metadata table exists, and that the app profile used is single cluster and single row transaction enabled.
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Disables validation that the table being written to exists.
- withoutValidation() - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
-
Returns a transform for reading TFRecord files that has GCS path validation on pipeline creation disabled.
- withOverloadRatio(double) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
The target ratio between requests sent and successful requests.
- withParallelism(Integer) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
-
Sets the number of parallel http client connections to the HEC.
- withParallelism(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
-
Same as
SplunkIO.Write.withParallelism(Integer)
but withValueProvider
. - withParameterSetter(JdbcIO.PreparedStatementSetter<ParameterT>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
- withParametersFunction(SerializableFunction<ParameterT, Map<String, Object>>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- withParametersFunction(SerializableFunction<ParameterT, Map<String, Object>>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- withParams(Map<String, Object>) - Method in class org.apache.beam.sdk.io.cdap.ConfigWrapper
- withParent(Schema.TypeName) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
- withParseFn(SerializableFunction<GenericRecord, X>, Coder<X>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
-
Reads
GenericRecord
of unspecified schema and maps them to instances of a custom type using the givenparseFn
and encoded using the given coder. - withParseFn(SerializableFunction<RowResult, T>) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
-
Provides the function to parse a row from Kudu into the typed object.
- withParser(MongoDbGridFSIO.Parser<X>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
- withPartition(Map<String, String>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
-
Sets the partition details.
- withPartitionCols(List<String>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
-
Set the names of the columns that are partitions.
- withPartitionColumn(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
-
The name of a column of numeric type that will be used for partitioning.
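A sketch of a partitioned JDBC read, assuming a Pipeline p, an existing dataSourceConfiguration, and a placeholder table with a numeric id column:

    import org.apache.beam.sdk.io.jdbc.JdbcIO;
    import org.apache.beam.sdk.values.KV;

    // The [0, 1_000_000] range of "id" is split into 10 ranges, each read by
    // its own query; if the bounds are omitted, they may be derived from the
    // table automatically.
    p.apply(
        JdbcIO.<KV<Long, String>>readWithPartitions()
            .withDataSourceConfiguration(dataSourceConfiguration)
            .withTable("users")
            .withPartitionColumn("id")
            .withLowerBound(0L)
            .withUpperBound(1_000_000L)
            .withNumPartitions(10)
            .withRowMapper(rs -> KV.of(rs.getLong("id"), rs.getString("name"))));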
- withPartitionCreatedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the time at which this partition was first detected and created in the metadata table.
- withPartitionEndTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the end time for the partition change stream query that originated this record.
- withPartitioner(KinesisPartitioner<T>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
-
Specify how to partition records among all stream shards (required).
- withPartitioning() - Method in interface org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write.PartitionedWriterBuilder
-
Writes to the sink with partitioning by Task Id.
- withPartitionOptions(PartitionOptions) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withPartitionOptions(PartitionOptions) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Note that
PartitionOptions
are currently ignored. - withPartitionQueryTimeout(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the PartitionQuery timeout.
- withPartitionQueryTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the PartitionQuery timeout.
- withPartitionReadTimeout(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the PartitionRead timeout.
- withPartitionReadTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the PartitionRead timeout.
- withPartitionRunningAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the time at which the connector started processing this partition.
- withPartitionScheduledAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the time at which this partition was scheduled to be queried.
- withPartitionStartTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the start time for the partition change stream query that originated this record.
- withPartitionToken(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the partition token where this record originated from.
- withPassword(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
- withPassword(String) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Sets the password to connect to your database.
- withPassword(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the password used for authentication.
- withPassword(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the password used for authentication.
- withPassword(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
If Elasticsearch authentication is enabled, provide the password.
- withPassword(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
-
Define the password to connect to the JMS broker (authenticated).
- withPassword(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
-
Define the password to connect to the JMS broker (authenticated).
- withPassword(String) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
- withPassword(String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- withPassword(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
- withPassword(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
- withPassword(ValueProvider<String>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Sets the password to connect to your database.
- withPassword(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the password used for authentication.
- withPassword(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the password used for authentication.
- withPassword(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- withPathPart(String) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
- withPathPrefix(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
If Elasticsearch is not running at the root path, e.g.
- withPayload(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError.Builder
-
Assigns the payload to be used for reprocessing.
- withPayloadFn(SerializableFunction<InputT, byte[]>) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
- withPercision(Integer) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
- withPercision(Integer) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
- withPipelineOptions(PipelineOptions) - Method in class org.apache.beam.runners.flink.translation.types.CoderTypeInformation
-
Creates a new
CoderTypeInformation
withPipelineOptions
, that can be used forFileSystems
registration. - withPlacementId(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- withPluginConfig(PluginConfig) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
-
Sets a
PluginConfig
. - withPluginConfig(PluginConfig) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
-
Sets a
PluginConfig
. - withPointInTimeSearch() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
-
Configures the source to use Point In Time search iteration while reading data from Elasticsearch.
- withPointInTimeSearchAndSortConfiguration(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
-
Similar to
the default PIT search
but setting a specific sorting configuration which Elasticsearch will use to sort for the results. - withPointInTimeSearchAndTimestampSortProperty(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
-
Similar to
the default PIT search
but setting an existing timestamp based property name which Elasticsearch will use to sort for the results. - withPollingInterval(Duration) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
-
If specified, polling for new partitions will happen at this periodicity.
- withPollInterval(Duration) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- withPollInterval(Duration) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Specifies how long to wait after a call to
Watch.Growth.PollFn
before calling it again (if at all - according toWatch.Growth.PollResult
and theWatch.Growth.TerminationCondition
). - withPort(int) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the port number of the Apache Cassandra instances.
- withPort(int) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the Cassandra instance port number where to write data.
- withPort(int) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
Define the port number of the Redis server.
- withPort(String) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Sets the port on which your database is listening.
- withPort(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the port number of the Apache Cassandra instances.
- withPort(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the port number of the Apache Cassandra instances.
- withPort(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
- withPort(ValueProvider<String>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Sets the port on which your database is listening.
- withPortNumber(Integer) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets port number to use to connect to Snowflake.
- withPositionalParameters(List<?>) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
- withPrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
Returns an
ApproximateDistinct.ApproximateDistinctFn
combiner with a new precisionp
. - withPrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.GloballyDistinct
-
Sets the precision
p
. - withPrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.PerKeyDistinct
-
Sets the precision
p
. - withPrecision(int) - Method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init.Builder
-
Explicitly set the
precision
parameter used to compute HLL++ sketch. - withPrecombining(boolean) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Enable precombining.
- withPredicates(List<KuduPredicate>) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
-
Filters the rows read from Kudu using the given predicates.
- withPrefix(String) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Specifies a common prefix to use for all generated filenames, if using the default file naming.
- withPrefix(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Like
FileIO.Write.withPrefix(String)
but with aValueProvider
. - withPreparedStatementSetter(JdbcIO.PreparedStatementSetter<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
- withPreparedStatementSetter(JdbcIO.PreparedStatementSetter<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
- withPreparedStatementSetter(JdbcIO.PreparedStatementSetter<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
- withPrimaryKey(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- withProcessingTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withProcessingTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Use the processing time as the output timestamp.
- withProcessingTime() - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
-
A
TimestampPolicy
that assigns processing time to each record. - withProcessingTime() - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
- withProcessingTimePolicy() - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory
-
Returns a ProcessingTimeWatermarkPolicy.
- withProcessingTimeWatermarkPolicy() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specifies the
WatermarkPolicyFactory
as ProcessingTimeWatermarkPolicyFactory. - withProducerConfigUpdates(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Update configuration for the producer.
- withProducerConfigUpdates(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Update configuration for the producer.
- withProducerFactoryFn(SerializableFunction<Map<String, Object>, Producer<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over
KafkaIO.WriteRecords.withProducerFactoryFn(SerializableFunction)
, used to keep the compatibility with old API based on KV type of element. - withProducerFactoryFn(SerializableFunction<Map<String, Object>, Producer<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Sets a custom function to create a Kafka producer.
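For context, a hedged write sketch; the broker address and topic name are hypothetical:

```java
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.kafka.common.serialization.LongSerializer;
import org.apache.kafka.common.serialization.StringSerializer;

// Write KV<Long, String> elements to a Kafka topic.
kvs.apply(
    KafkaIO.<Long, String>write()
        .withBootstrapServers("broker-1:9092") // hypothetical
        .withTopic("results")                  // hypothetical
        .withKeySerializer(LongSerializer.class)
        .withValueSerializer(StringSerializer.class));
```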
- withProjectedColumns(List<String>) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
-
Filters the columns read from the table to include only those specified.
- withProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
- withProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
- withProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
- withProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
- withProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new BigtableIO.Read that will read from the Cloud Bigtable project indicated by the given parameter; requires BigtableIO.Read.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
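A hedged read sketch; project, instance, and table ids are hypothetical:

```java
import com.google.bigtable.v2.Row;
import org.apache.beam.sdk.io.gcp.bigtable.BigtableIO;
import org.apache.beam.sdk.values.PCollection;

// withInstanceId must accompany withProjectId, as noted above.
PCollection<Row> rows =
    p.apply(
        BigtableIO.read()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withTableId("my-table"));
```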
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Returns a new BigtableIO.ReadChangeStream that will stream from the Cloud Bigtable project indicated by the given parameter; requires BigtableIO.ReadChangeStream.withInstanceId(java.lang.String) to be called to determine the instance.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new BigtableIO.Write that will write into the Cloud Bigtable project indicated by the given parameter; requires BigtableIO.Write.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Returns a new
DatastoreV1.DeleteEntity
that deletes entities from the Cloud Datastore for the specified project. - withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
-
Returns a new
DatastoreV1.DeleteEntityWithSummary
that deletes entities from the Cloud Datastore for the specified project. - withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Returns a new
DatastoreV1.DeleteKey
that deletes entities from the Cloud Datastore for the specified project. - withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
-
Returns a new
DatastoreV1.DeleteKeyWithSummary
that deletes entities from the Cloud Datastore for the specified project. - withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns a new
DatastoreV1.Read
that reads from the Cloud Datastore for the specified project. - withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Returns a new
DatastoreV1.Write
that writes to the Cloud Datastore for the default database. - withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
-
Returns a new
DatastoreV1.WriteWithSummary
that writes to the Cloud Datastore for the default database. - withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Write
-
Factory method to create a new type safe builder for
Write
operations. - withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner project ID.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner project.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner project.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner project.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the Cloud Spanner project.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner project.
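A hedged read sketch tying these setters together; all ids and names are hypothetical:

```java
import com.google.cloud.spanner.Struct;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
import org.apache.beam.sdk.values.PCollection;

// Read selected columns of a Cloud Spanner table.
PCollection<Struct> rows =
    p.apply(
        SpannerIO.read()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withDatabaseId("my-database")
            .withTable("Albums")
            .withColumns("SingerId", "AlbumTitle"));
```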
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new BigtableIO.Read that will read from the Cloud Bigtable project indicated by the given parameter; requires BigtableIO.Read.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new BigtableIO.Write that will write into the Cloud Bigtable project indicated by the given parameter; requires BigtableIO.Write.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Same as DatastoreV1.DeleteEntity.withProjectId(String) but with a ValueProvider.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
-
Same as DatastoreV1.DeleteEntityWithSummary.withProjectId(String) but with a ValueProvider.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Same as DatastoreV1.DeleteKey.withProjectId(String) but with a ValueProvider.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
-
Same as DatastoreV1.DeleteKeyWithSummary.withProjectId(String) but with a ValueProvider.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Same as DatastoreV1.Read.withProjectId(String) but with a ValueProvider.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Same as DatastoreV1.Write.withProjectId(String) but with a ValueProvider.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
-
Same as DatastoreV1.WriteWithSummary.withProjectId(String) but with a ValueProvider.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner project ID.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner project.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner project.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner project.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the Cloud Spanner project.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner project.
- withProjection(List<String>) - Method in class org.apache.beam.sdk.io.mongodb.FindQuery
-
Sets the projection.
- withProjection(Schema, Schema) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
-
Enable the reading with projection.
- withProjection(Schema, Schema) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
- withPropagateSuccessfulStorageApiWrites(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If set to true, then all successful writes will be propagated to WriteResult and accessible via the WriteResult.getSuccessfulStorageApiInserts() method.
- withPropagateSuccessfulStorageApiWrites(Predicate<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If called, then all successful writes will be propagated to WriteResult and accessible via the WriteResult.getSuccessfulStorageApiInserts() method.
- withProtocol(TProtocolFactory) - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.ReadFiles
-
Specifies the
TProtocolFactory
to be used to decode Thrift objects. - withPublishRequestBuilder(SerializableFunction<T, PublishRequest.Builder>) - Method in class org.apache.beam.sdk.io.aws2.sns.SnsIO.Write
-
Function to convert a message into a PublishRequest.Builder (mandatory).
- withPublishRequestFn(SerializableFunction<T, PublishRequest>) - Method in class org.apache.beam.sdk.io.aws2.sns.SnsIO.Write
-
Deprecated.
- withPublishTime() - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
- withPublishTimestampFunction(KafkaPublishTimestampFunction<KV<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Deprecated. Use KafkaIO.WriteRecords and ProducerRecords to set the publish timestamp.
- withPublishTimestampFunction(KafkaPublishTimestampFunction<ProducerRecord<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Deprecated. Use ProducerRecords to set the publish timestamp.
- withPubsubRootUrl(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
- withPullFrequencySec(Long) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
-
Delay in seconds between polls for new record updates.
- withPullFrequencySec(Long) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
-
Delay in seconds between polls for new record updates.
- withPulsarClient(SerializableFunction<String, PulsarClient>) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
- withQuery(Statement) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withQuery(Statement) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withQuery(Query) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns a new
DatastoreV1.Read
that reads the results of the specified query. - withQuery(String) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.Read
-
Specify the query to read data.
- withQuery(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the query to read data.
- withQuery(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
-
Provide a query used while reading from Elasticsearch.
- withQuery(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withQuery(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withQuery(String) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.Read
-
Creates and returns a new
GoogleAdsV19.Read
transform with the specified query. - withQuery(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
-
Sets the query to use.
- withQuery(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
- withQuery(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
- withQuery(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
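A hedged read sketch for the JdbcIO variants above; the driver, URL, and query are hypothetical, and depending on the SDK version an explicit coder may also be needed:

```java
import org.apache.beam.sdk.io.jdbc.JdbcIO;
import org.apache.beam.sdk.values.KV;

// Map each ResultSet row to a KV pair.
p.apply(
    JdbcIO.<KV<Integer, String>>read()
        .withDataSourceConfiguration(
            JdbcIO.DataSourceConfiguration.create(
                "org.postgresql.Driver",                // hypothetical driver
                "jdbc:postgresql://localhost/testdb"))  // hypothetical URL
        .withQuery("SELECT id, name FROM users")        // hypothetical query
        .withRowMapper(rs -> KV.of(rs.getInt("id"), rs.getString("name"))));
```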
- withQuery(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
- withQuery(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
- withQuery(String) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
-
Provide a query used while reading from Solr.
- withQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the query to read data.
- withQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
-
Provide a
ValueProvider
that provides the query used while reading from Elasticsearch. - withQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
- withQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
- withQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
- withQueryFn(SerializableFunction<MongoCollection<Document>, MongoCursor<Document>>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
-
Sets a queryFn.
- withQueryLocation(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
BigQuery geographic location where the query job will be executed.
- withQueryName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withQueryName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withQueryPlannerClass(Class<? extends QueryPlanner>) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
- withQueryPriority(BigQueryIO.TypedRead.QueryPriority) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withQueryStartedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the time that the change stream query which produced this record started.
- withQueryTempDataset(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
Temporary dataset reference when using
BigQueryIO.TypedRead.fromQuery(String)
. - withQueryTempProjectAndDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withQueryTimeout(Integer) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
Sets the default query timeout that will be used for connections created by this source.
- withQueryTimeout(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
Same as
JdbcIO.DataSourceConfiguration.withQueryTimeout(Integer)
but accepting a ValueProvider. - withQueryTransformation(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
A query to be executed in Snowflake.
- withQueryTransformation(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
- withQueue(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
-
Specify the JMS queue destination name where to read messages from.
- withQueue(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
-
Specify the JMS queue destination name where to send messages to.
- withQueue(String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
-
If you want to directly consume messages from a specific queue, you just have to specify the queue name.
- withQueue(String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
-
Defines the queue where the messages will be sent.
- withQueueDeclare(boolean) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
-
You can "force" the declaration of a queue on the RabbitMQ broker.
- withQueueDeclare(boolean) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
-
If the queue is not declared by another application,
RabbitMqIO
can declare the queue itself. - withQueueUrl(String) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Read
-
Define the queueUrl used by the
SqsIO.Read
to receive messages from SQS. - withQuotationMark(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
Sets Snowflake-specific quotations around strings.
- withQuotationMark(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
Sets Snowflake-specific quotations around strings.
- withQuotationMark(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
- withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Returns a new
DatastoreV1.DeleteEntity
that does not throttle during ramp-up. - withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntityWithSummary
-
Returns a new
DatastoreV1.DeleteEntityWithSummary
that does not throttle during ramp-up. - withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Returns a new
DatastoreV1.DeleteKey
that does not throttle during ramp-up. - withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKeyWithSummary
-
Returns a new
DatastoreV1.DeleteKeyWithSummary
that does not throttle during ramp-up. - withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Returns a new
DatastoreV1.Write
that does not throttle during ramp-up. - withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteWithSummary
-
Returns a new
DatastoreV1.WriteWithSummary
that does not throttle during ramp-up. - withRandomAccess() - Method in class org.apache.beam.sdk.transforms.View.AsList
-
Returns a PCollection view like this one, but whose resulting list will have RandomAccess (aka fast indexing).
- withRandomAccess(boolean) - Method in class org.apache.beam.sdk.transforms.View.AsList
-
Returns a PCollection view like this one, but whose resulting list will have RandomAccess (aka fast indexing) according to the input parameter.
- withRate(long, Duration) - Method in class org.apache.beam.sdk.io.GenerateSequence
-
Specifies to generate at most a given number of elements per a given period.
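A hedged sketch of rate-limited generation:

```java
import org.apache.beam.sdk.io.GenerateSequence;
import org.joda.time.Duration;

// An unbounded sequence throttled to at most 5 elements per second.
p.apply(GenerateSequence.from(0).withRate(5, Duration.standardSeconds(1)));
```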
- withRateLimitPolicy(GoogleAdsIO.RateLimitPolicyFactory<GoogleAdsError>) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.Read
-
Creates and returns a new
GoogleAdsV19.Read
transform with the specified rate limit policy factory. - withRateLimitPolicy(GoogleAdsIO.RateLimitPolicyFactory<GoogleAdsError>) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV19.ReadAll
-
Creates and returns a new
GoogleAdsV19.ReadAll
transform with the specified rate limit policy factory. - withReadChangeStreamTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Returns a new
BigtableIO.ReadChangeStream
that overrides timeout for ReadChangeStream requests. - withReadCommitted() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets "isolation_level" to "read_committed" in Kafka consumer configuration.
- withReadCommitted() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Sets "isolation_level" to "read_committed" in Kafka consumer configuration.
- withReadOperation(ReadOperation) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withReadTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments.Builder
- withReadTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds.Builder
- withReadTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments.Builder
- withReadTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
- withReadTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery.Builder
- withReadTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns a new DatastoreV1.Read that reads at the specified readTime.
- withReadTimeout(Integer) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the Cassandra client read timeout in ms.
- withReadTimeout(Integer) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Cassandra client socket option to set the read timeout in ms.
- withReadTimeout(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the Cassandra client read timeout in ms.
- withReadTimeout(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Cassandra client socket option to set the read timeout in ms.
- withReadTransaction() - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- withReceiveTimeout(Duration) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
-
If set, block for the timeout Duration on each poll for a new JMS record when the previous poll returned no record.
- withRecordAggregation(Consumer<KinesisIO.RecordAggregation.Builder>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
-
Enable record aggregation that is compatible with the KPL / KCL.
- withRecordAggregation(KinesisIO.RecordAggregation) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
-
Enable record aggregation that is compatible with the KPL / KCL.
- withRecordAggregationDisabled() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
-
Disable KPL / KCL like record aggregation.
- withRecordClass(Class<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
Sets a JAXB annotated class that can be populated using a record of the provided XML file.
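A hedged read sketch; Record is a hypothetical JAXB-annotated class and the file pattern is illustrative:

```java
import org.apache.beam.sdk.io.xml.XmlIO;

// Assumes XML shaped like <records><record>...</record></records>.
p.apply(
    XmlIO.<Record>read()
        .from("/path/to/input*.xml") // hypothetical glob
        .withRootElement("records")
        .withRecordElement("record")
        .withRecordClass(Record.class));
```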
- withRecordClass(Class<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
- withRecordClass(Class<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
-
Writes objects of the given class mapped to XML elements using JAXB.
- withRecordElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
Sets name of the record element of the XML document.
- withRecordElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
- withRecordNumMetadata() - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
-
Allows the user to opt into getting recordNums associated with each record.
- withRecordReadAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the time at which the record was fully read.
- withRecordStreamEndedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the time at which the record finished streaming.
- withRecordStreamStartedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the time at which the record started to be streamed.
- withRecordTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the timestamp of when this record occurred.
- withRedistribute() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets a redistribute transform that hints to the runner to try to redistribute the work evenly.
- withRedistribute() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Enable Redistribute.
- withRedistributeNumKeys(int) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withRedistributeNumKeys(int) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- withRelativeError(double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.GlobalSketch
-
Sets the relative error
epsilon
. - withRelativeError(double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.PerKeySketch
-
Sets the relative error
epsilon
. - withReplicaInfo(SolrIO.ReplicaInfo) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
-
Read from a specific Replica (partition).
- withReportDiagnosticMetrics() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Whether additional diagnostic metrics should be reported for a Transform.
- withRepresentativeCoder(Coder<IdT>) - Method in class org.apache.beam.sdk.transforms.Deduplicate.WithRepresentativeValues
-
Return a
WithRepresentativeValues
PTransform
that is like this one, but with the specified id type coder. - withRepresentativeType(TypeDescriptor<IdT>) - Method in class org.apache.beam.sdk.transforms.Deduplicate.WithRepresentativeValues
-
Return a
WithRepresentativeValues
PTransform
that is like this one, but with the specified id type descriptor. - withRepresentativeType(TypeDescriptor<IdT>) - Method in class org.apache.beam.sdk.transforms.Distinct.WithRepresentativeValues
-
Return a
WithRepresentativeValues
PTransform
that is like this one, but with the specified output type descriptor. - withRepresentativeValueFn(SerializableFunction<T, IdT>) - Static method in class org.apache.beam.sdk.transforms.Deduplicate
-
Returns a deduplication transform that deduplicates values using the supplied representative value for up to 10 mins within the processing time domain.
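A hedged sketch; Event and its getId() accessor are hypothetical:

```java
import org.apache.beam.sdk.transforms.Deduplicate;
import org.apache.beam.sdk.values.TypeDescriptors;

// Deduplicate events by a representative id value.
events.apply(
    Deduplicate.<Event, String>withRepresentativeValueFn(Event::getId)
        .withRepresentativeType(TypeDescriptors.strings()));
```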
- withRepresentativeValueFn(SerializableFunction<T, IdT>) - Static method in class org.apache.beam.sdk.transforms.Distinct
-
Returns a
Distinct<T, IdT>
PTransform
. - withRequestRecordsLimit(int) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specifies the maximum number of records in the GetRecordsResult returned by a GetRecords call, which is limited to 10K records.
- withRequiresDeduping() - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
-
If set, requires runner deduplication for the messages.
- withResponseItemJson(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
-
Sets the element from Elasticsearch Bulk API response "items" pertaining to this WriteSummary.
- withResults() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Returns a DatastoreV1.DeleteEntityWithSummary transform which can be used in Wait.on(PCollection[]) to wait until all data is deleted.
- withResults() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Returns a DatastoreV1.DeleteKeyWithSummary transform which can be used in Wait.on(PCollection[]) to wait until all data is deleted.
- withResults() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Returns a DatastoreV1.WriteWithSummary transform which can be used in Wait.on(PCollection[]) to wait until all data is written.
- withResults() - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
-
Returns a JdbcIO.WriteVoid transform which can be used in Wait.on(PCollection[]) to wait until all data is written.
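A hedged sketch of the Wait.on pattern these entries describe; writeSignal stands for the output of a write applied with withResults(), and mainInput is any other PCollection:

```java
import org.apache.beam.sdk.transforms.Wait;
import org.apache.beam.sdk.values.PCollection;

// Hold back mainInput until the write that produced writeSignal completes.
PCollection<String> gated = mainInput.apply(Wait.on(writeSignal));
```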
- withResumeDelay(Duration) - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
-
Builder method to set the value of
DoFn.ProcessContinuation.resumeDelay()
. - withRetained(boolean) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
-
Whether or not the publish message should be retained by the messaging engine.
- withRetentionPolicy(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
-
Sets the retention policy to use.
- withRetentionPolicy(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
-
Sets the retention policy to use.
- withRetryableCodes(ImmutableSet<StatusCode.Code>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the errors that will be retried by the client library for all operations.
- withRetryConfiguration(ElasticsearchIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
-
Provides configuration to retry a failed batch call to Elasticsearch.
- withRetryConfiguration(ElasticsearchIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withRetryConfiguration(JdbcIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
- withRetryConfiguration(JdbcIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
-
When a SQL exception occurs, JdbcIO.Write uses this JdbcIO.RetryConfiguration to exponentially back off and retry the statements.
- withRetryConfiguration(JdbcIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
-
When a SQL exception occurs, JdbcIO.Write uses this JdbcIO.RetryConfiguration to exponentially back off and retry the statements.
- withRetryConfiguration(RetryConfiguration) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
-
Specify the JMS retry configuration.
- withRetryConfiguration(SolrIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
-
Provides configuration to retry a failed batch call to Solr.
- withRetryStrategy(JdbcIO.RetryStrategy) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
- withRetryStrategy(JdbcIO.RetryStrategy) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
-
When a SQL exception occurs, JdbcIO.Write uses this JdbcIO.RetryStrategy to determine whether it will retry the statements.
- withRetryStrategy(JdbcIO.RetryStrategy) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
-
When a SQL exception occurs, JdbcIO.Write uses this JdbcIO.RetryStrategy to determine whether it will retry the statements.
- withRingRanges(Set<RingRange>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
- withRingRanges(ValueProvider<Set<RingRange>>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
- withRole(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets the user's role to be used when running queries on Snowflake.
- withRole(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets the user's role to be used when running queries on Snowflake.
- withRootCaCertificatePath(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
-
Method to set the root CA certificate.
- withRootCaCertificatePath(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
-
Same as SplunkIO.Write.withRootCaCertificatePath(String) but with a ValueProvider.
- withRootElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
Sets name of the root element of the XML document.
- withRootElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
- withRootElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
- withRootElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
-
Sets the enclosing root element for the generated XML files.
- withRoutingFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
-
Provide a function to extract the target routing from the document allowing for dynamic document routing.
- withRoutingFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withRowFilter(RowFilter) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will filter the rows read from Cloud Bigtable using the given row filter. - withRowFilter(ValueProvider<RowFilter>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will filter the rows read from Cloud Bigtable using the given row filter. - withRowGroupSize(int) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
-
Specify row-group size; if not set or zero, a default is used by the underlying writer.
- withRowMapper(JdbcIO.RowMapper<OutputT>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
- withRowMapper(JdbcIO.RowMapper<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
- withRowMapper(JdbcIO.RowMapper<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
- withRowMapper(JdbcIO.RowMapper<V>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
- withRowMapper(Neo4jIO.RowMapper<OutputT>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- withRowMapper(SingleStoreIO.RowMapper<T>) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
- withRowMapper(SingleStoreIO.RowMapper<T>) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
- withRowMutationInformationFn(SerializableFunction<T, RowMutationInformation>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Allows upserting and deleting rows for tables with a primary key defined.
- withRowOutput() - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
-
Data output type is Row, and the schema is auto-inferred from the database.
- withRowRestriction(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withRowRestriction(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
Read only rows which match the specified filter, which must be a SQL expression compatible with Google standard SQL.
- withRowSchema(Schema) - Method in class org.apache.beam.sdk.transforms.Create.Values
-
Returns a Create.Values PTransform like this one that uses the given Schema to represent objects.
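A hedged sketch attaching a schema to created rows:

```java
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.Row;

// Build a schema, then create a schema-aware PCollection of rows.
Schema schema =
    Schema.builder().addStringField("name").addInt32Field("age").build();
p.apply(
    Create.of(Row.withSchema(schema).addValues("ada", 36).build())
        .withRowSchema(schema));
```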
- withRpcPriority(Options.RpcPriority) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the RPC priority.
- withRpcPriority(Options.RpcPriority) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the priority of the change stream queries.
- withRpcPriority(ValueProvider<Options.RpcPriority>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the RPC priority.
- withRpcQosOptions(RpcQosOptions) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments.Builder
-
Specify the RpcQosOptions that will be used when bootstrapping the QOS of each running instance of the Transform created by this builder.
- withRpcQosOptions(RpcQosOptions) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue.Builder
-
Specify the RpcQosOptions that will be used when bootstrapping the QOS of each running instance of the Transform created by this builder.
- withRpcQosOptions(RpcQosOptions) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary.Builder
-
Specify the RpcQosOptions that will be used when bootstrapping the QOS of each running instance of the Transform created by this builder.
- withRpcQosOptions(RpcQosOptions) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds.Builder
-
Specify the RpcQosOptions that will be used when bootstrapping the QOS of each running instance of the Transform created by this builder.
- withRpcQosOptions(RpcQosOptions) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments.Builder
-
Specify the RpcQosOptions that will be used when bootstrapping the QOS of each running instance of the Transform created by this builder.
- withRpcQosOptions(RpcQosOptions) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
-
Specify the RpcQosOptions that will be used when bootstrapping the QOS of each running instance of the Transform created by this builder.
- withRpcQosOptions(RpcQosOptions) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery.Builder
-
Specify the RpcQosOptions that will be used when bootstrapping the QOS of each running instance of the Transform created by this builder.
- withRunnerDeterminedSharding() - Method in class org.apache.beam.sdk.io.WriteFiles
-
Returns a new WriteFiles that will write to the current FileBasedSink with runner-determined sharding.
- withSamplePeriod(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the length of time sampled request data will be retained.
- withSamplePeriodBucketSize(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the size of buckets within the specified samplePeriod.
- withScan(Scan) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
-
Filters the rows read from HBase using the given scan.
- withScanRequestFn(SerializableFunction<Void, ScanRequest>) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
-
The ScanRequest object can't be passed directly from the client, since it is not fully serializable.
- withScanResponseMapperFn(SerializableFunction<ScanResponse, T>) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
- withSchema(TableSchema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Uses the specified schema for rows to be written.
- withSchema(Class<X>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
-
Reads files containing records of the given class.
- withSchema(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
-
Reads files containing records that conform to the given schema.
- withSchema(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets schema to use when connecting to Snowflake.
- withSchema(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Sets the output schema.
- withSchema(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- withSchema(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
- withSchema(ValueProvider<TableSchema>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Same as BigQueryIO.Write.withSchema(TableSchema) but using a deferred ValueProvider.
- withSchema(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- withSchema(Schema) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable.DocumentToRow
- withSchema(Schema) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
- withSchema(Schema) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
- withSchema(Schema) - Static method in class org.apache.beam.sdk.transforms.JsonToRow
- withSchema(Schema) - Static method in class org.apache.beam.sdk.values.Row
-
Creates a row builder with the specified Row.getSchema().
- withSchema(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
-
Returns a Create.TimestampedValues PTransform like this one that uses the given Schema to represent objects.
- withSchema(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.transforms.Create.Values
-
Returns a Create.Values PTransform like this one that uses the given Schema to represent objects.
- withSchema(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.transforms.Create.WindowedValues
-
Returns a Create.WindowedValues PTransform like this one that uses the given Schema to represent objects.
- withSchemaAndNullBehavior(Schema, RowJson.RowJsonDeserializer.NullBehavior) - Static method in class org.apache.beam.sdk.transforms.JsonToRow
- withSchemaFromView(PCollectionView<Map<String, String>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Allows the schemas for each table to be computed within the pipeline itself.
- withSchemaReadySignal(PCollection<?>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies an optional input PCollection that can be used as the signal for
Wait.OnSignal
to indicate when the database schema is ready to be read. - withSchemaUpdateOptions(Set<BigQueryIO.Write.SchemaUpdateOption>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Allows the schema of the destination table to be updated as a side effect of the write.
- withScrollKeepalive(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
-
Provide a scroll keepalive.
- withSelectedFields(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withSelectedFields(ValueProvider<List<String>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
Read only the specified fields (columns) from a BigQuery table.
- withSempClientFactory(SempClientFactory) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
-
Set a factory that creates a
SempClientFactory
. - withSerializer(SerializableFunction<T, byte[]>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
-
Specify how to serialize records to bytes on the stream (required).
- withServerName(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets the name of the Snowflake server.
- withServerName(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
- withServerUri(String) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
-
Set up the MQTT broker URI.
- withSessionConfig(ValueProvider<SessionConfig>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- withSessionConfig(SessionConfig) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- withSessionConfig(SessionConfig) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- withSessionServiceFactory(SessionServiceFactory) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
-
Set a factory that creates a
SessionService
. - withSessionServiceFactory(SessionServiceFactory) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
-
Set the provider used to obtain the properties to initialize a new session in the broker.
- withShard(int) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
- withShardedKey() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
-
Outputs batched elements associated with sharded input keys.
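A hedged sketch, assuming a PCollection<KV<String, String>> named kvs:

```java
import org.apache.beam.sdk.transforms.GroupIntoBatches;
import org.joda.time.Duration;

// Batch up to 100 values per key, flushing after 30s of buffering; the
// sharded-key variant lets the runner split hot keys across workers.
kvs.apply(
    GroupIntoBatches.<String, String>ofSize(100)
        .withMaxBufferingDuration(Duration.standardSeconds(30))
        .withShardedKey());
```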
- withSharding(PTransform<PCollection<UserT>, PCollectionView<Integer>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Specifies a
PTransform
to use for computing the desired number of shards in each window. - withSharding(PTransform<PCollection<UserT>, PCollectionView<Integer>>) - Method in class org.apache.beam.sdk.io.WriteFiles
-
Returns a new
WriteFiles
that will write to the current FileBasedSink
using the specified PTransform
to compute the number of shards. - withShardingFunction(ShardingFunction<UserT, DestinationT>) - Method in class org.apache.beam.sdk.io.WriteFiles
-
Returns a new
WriteFiles
that will write to the current FileBasedSink
using the specified sharding function to assign shard for inputs. - withShardNameTemplate(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Uses the given
ShardNameTemplate
for naming output files. - withShardNameTemplate(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- withShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Uses the given
ShardNameTemplate
for naming output files. - withShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
-
Uses the given shard name template.
- withShardsNumber(Integer) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
Number of shards that are created per window.
- withShardTemplate(String) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
-
Uses the given
ShardNameTemplate
for naming output files. - withShardTemplate(String) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
-
Sets the shard template.
- withShardTemplate(String) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
-
Uses the given
ShardNameTemplate
for naming output files. - withSideInput() - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
- withSideInput(String, PCollectionView<?>) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
-
Returns a new multi-output
ParDo
PTransform
that's like this PTransform
but with the specified additional side inputs. - withSideInput(String, PCollectionView<?>) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
-
Returns a new
ParDo
PTransform
that's like this PTransform
but with the specified additional side inputs. - withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns a
PTransform
identical to this, but with the specified side inputs to use in CombineWithContext.CombineFnWithContext
. - withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
- withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
Returns a
PTransform
identical to this, but with the specified side inputs to use in CombineWithContext.CombineFnWithContext
. - withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
- withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
-
Returns a new
ParDo
PTransform
that's like this PTransform
but with the specified additional side inputs. - withSideInputs(List<PCollectionView<?>>) - Method in class org.apache.beam.sdk.io.WriteFiles
- withSideInputs(Map<String, PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
-
Returns a new multi-output
ParDo
PTransform
that's like this PTransform
but with the specified additional side inputs. - withSideInputs(Map<String, PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
-
Returns a new
ParDo
PTransform
that's like this PTransform
but with the specified additional side inputs. - withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns a
PTransform
identical to this, but with the specified side inputs to use in CombineWithContext.CombineFnWithContext
. - withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
- withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
Returns a
PTransform
identical to this, but with the specified side inputs to use in CombineWithContext.CombineFnWithContext
. - withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
-
Returns a new multi-output
ParDo
PTransform
that's like this PTransform
but with the specified additional side inputs. - withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
-
Returns a new
ParDo
PTransform
that's like this PTransform
but with the specified additional side inputs. - withSingletonValues() - Method in class org.apache.beam.sdk.transforms.View.AsMap
-
Deprecated. This method simply returns this AsMap unmodified.
- withSize(int) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
-
Create a
AvroUtils.FixedBytesField
with the specified size. - withSize(long) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
- withSkew(Duration) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
- withSkipHeaderLines(int) - Method in class org.apache.beam.sdk.io.TextIO.Read
- withSkipHeaderLines(int) - Method in class org.apache.beam.sdk.io.TextIO.ReadFiles
- withSkipIfEmpty() - Method in class org.apache.beam.sdk.io.WriteFiles
- withSkipIfEmpty(boolean) - Method in class org.apache.beam.sdk.io.WriteFiles
-
Set this sink to skip writing any files if the PCollection is empty.
- withSkipKeyClone(boolean) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
Determines if key clone should be skipped or not (default is 'false').
- withSkipValueClone(boolean) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
Determines if value clone should be skipped or not (default is 'false').
- withSnowflakeServices(SnowflakeServices) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
A Snowflake service SnowflakeServices implementation to use.
- withSnowPipe(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
Sets the name of the SnowPipe, which can be created in the Snowflake dashboard or CLI.
- withSnowPipe(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
Same as withSnowPipe(String), but with a ValueProvider.
- withSocketTimeout(Integer) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
If set, overwrites the default max retry timeout (30000ms) in the Elastic RestClient and the default socket timeout (30000ms) in the RequestConfig of the Elastic RestClient.
- withSoftTimeout(Duration) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.RestrictionInterrupter
-
Sets a soft timeout from now for processing new positions.
- withSource(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
-
Assigns source value to the event metadata.
- withSourceConnector(ValueProvider<SourceConnector>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
- withSourceConnector(SourceConnector) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Sets the
SourceConnector
to be used. - withSourceType(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
-
Assigns sourceType value to the event metadata.
- withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner configuration.
- withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner configuration.
- withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner configuration.
- withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the Cloud Spanner configuration.
- withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner configuration.
- withSparkReceiverBuilder(ReceiverBuilder<V, ? extends Receiver<V>>) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
-
Sets
ReceiverBuilder
with value and custom SparkReceiver
class. - withSparsePrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.GloballyDistinct
-
Sets the sparse representation's precision
sp
. - withSparsePrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.PerKeyDistinct
-
Sets the sparse representation's precision
sp
. - withSparseRepresentation(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
Returns an
ApproximateDistinct.ApproximateDistinctFn
combiner with a new sparse representation's precision sp
. - withSsl(SSLOptions) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Optionally, specify
SSLOptions
configuration to utilize SSL. - withSsl(SSLOptions) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Optionally, specify
SSLOptions
configuration to utilize SSL. - withSsl(ValueProvider<SSLOptions>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Optionally, specify
SSLOptions
configuration to utilize SSL. - withSsl(ValueProvider<SSLOptions>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Optionally, specify
SSLOptions
configuration to utilize SSL. - withSSL(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
Define if a SSL connection to Redis server should be used.
- withSSLEnabled(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
-
Enable ssl for connection.
- withSSLEnabled(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
-
Enable ssl for connection.
- withSSLInvalidHostNameAllowed(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
-
Enable invalidHostNameAllowed for ssl for connection.
- withSSLInvalidHostNameAllowed(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
-
Enable invalidHostNameAllowed for ssl for connection.
- withStagingBucketName(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
Name of the cloud bucket (currently GCS) to use as a temporary location for CSVs during the COPY statement.
- withStagingBucketName(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
Name of the cloud bucket (currently GCS) to use as a temporary location for CSVs during the COPY statement.
- withStagingBucketName(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
- withStagingBucketName(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
- withStartingDay(int, int, int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
- withStartingMonth(int, int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- withStartingStrategy(IcebergIO.ReadRows.StartingStrategy) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
- withStartingYear(int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
- withStartKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns new
ByteKeyRange
like this one, but with the specified start key. - withStartOffset(Long) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
-
Inclusive start offset from which the reading should be started.
- withStartOffset(Long) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
-
Inclusive start offset from which the reading should be started.
- withStartPollTimeoutSec(Long) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
-
Delay in seconds before start polling.
- withStartPollTimeoutSec(Long) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
-
Waiting time after the
Receiver
starts. - withStartReadTime(Instant) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Use timestamp to set up start offset.
- withStartTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Returns a new
BigtableIO.ReadChangeStream
that will start streaming at the specified start time. - withStartTimestamp(Long) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
- withStatement(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
- withStatement(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
- withStatement(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
- withStatement(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
- withStatement(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
- withStatementPreparator(JdbcIO.StatementPreparator) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
- withStatementPreparator(JdbcIO.StatementPreparator) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
- withStatementPreparator(SingleStoreIO.StatementPreparator) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
- withStatusCode(Integer) - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError.Builder
-
Assigns a return status code to assist with debugging.
- withStatusMessage(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError.Builder
-
Assigns a return status message to assist with debugging.
- withStopReadTime(Instant) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Use timestamp to set up stop offset.
- withStopTime(Instant) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
- withStorageClient(BigQueryServices.StorageClient) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- withStorageIntegrationName(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
Name of the Storage Integration in Snowflake to be used.
- withStorageIntegrationName(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
Name of the Storage Integration in Snowflake to be used.
- withStorageIntegrationName(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
- withStorageIntegrationName(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
- withStreamName(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specify reading from streamName.
- withStreamName(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
-
Kinesis stream name which will be used for writing (required).
- withSubmissionMode(SolaceIO.SubmissionMode) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
-
This setting controls the JCSMP property MESSAGE_CALLBACK_ON_REACTOR.
- withSuccessfulInsertsPropagation(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If true, enables the propagation of the successfully inserted TableRows on BigQuery as part of the WriteResult object when using BigQueryIO.Write.Method.STREAMING_INSERTS.
- withSuffix(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Configures the filename suffix for written files.
- withSuffix(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- withSuffix(String) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
-
Configures the filename suffix for written files.
- withSuffix(String) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
-
Sets the suffix.
- withSuffix(String) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Specifies a common suffix to use for all generated filenames, if using the default file naming.
- withSuffix(String) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
-
Configures the filename suffix for written files.
- withSuffix(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Configures the filename suffix for written files.
- withSuffix(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withSuffix(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
-
Writes to the file(s) with the given filename suffix.
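As an illustration of the withSuffix entries above, a minimal TextIO sketch; the output prefix is hypothetical:

    import org.apache.beam.sdk.io.TextIO;

    // Assumes `lines` is an existing PCollection<String>.
    lines.apply(
        TextIO.write()
            .to("gs://my-bucket/output/part")  // hypothetical output prefix
            .withSuffix(".txt")                // each shard filename ends in .txt
            .withNumShards(3));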
- withSuffix(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Like FileIO.Write.withSuffix(String) but with a ValueProvider.
- withSyncInterval(int) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Sets the approximate number of uncompressed bytes to write in each block for the AVRO container format.
- withSyncInterval(int) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- withTable(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the Cassandra table where to read data.
- withTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withTable(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
-
Sets the table name to read from.
- withTable(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
-
Sets the table name to write to; the table should exist beforehand.
- withTable(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
-
Name of the table in the external database.
- withTable(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
- withTable(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
- withTable(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
- withTable(String) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
-
Reads from the specified table.
- withTable(String) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Write
-
Writes to the specified table.
- withTable(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
- withTable(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
- withTable(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
- withTable(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the Cassandra table where to read data.
- withTableDescription(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Specifies the table description.
- withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new BigtableIO.Read that will read from the specified table.
- withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
-
Returns a new BigtableIO.ReadChangeStream that will stream from the specified table.
- withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new BigtableIO.Write that will write to the specified table.
- withTableId(String) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
-
Reads from the specified table.
- withTableId(String) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
-
Writes to the specified table.
- withTableId(String) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
-
Writes to the specified table.
- withTableId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new BigtableIO.Read that will read from the specified table.
- withTableId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new BigtableIO.Write that will write to the specified table.
- withTableProvider(String, TableProvider) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
- withTableProvider(TableProvider) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
-
This method creates BeamSqlEnv using empty Pipeline Options.
- withTableReference(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- withTableSchema(TableSchema) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
-
Set TableSchema.
- withTableSchema(SnowflakeTableSchema) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
Table schema to be used when creating the table.
- withTempDirectory(String) - Method in class org.apache.beam.sdk.io.FileIO.Write
-
Specifies a directory into which all temporary files will be placed.
- withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Set the base directory used to generate temporary files.
- withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
-
Set the base directory used to generate temporary files.
- withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
-
Set the base directory used to generate temporary files.
- withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Set the base directory used to generate temporary files.
- withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withTempDirectory(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Write
- withTempDirectory(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Set the base directory used to generate temporary files.
- withTempDirectory(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- withTempDirectory(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Set the base directory used to generate temporary files.
- withTempDirectory(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withTemplateCompatibility() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Use new template-compatible source implementation.
- withTemplateCompatibility() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withTempLocation(String) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
-
Sets the path to a temporary location where the sorter writes intermediate files.
- withTerminationCondition(Watch.Growth.TerminationCondition<HCatalogIO.Read, ?>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
-
If specified, the poll function will stop polling after the termination condition has been satisfied.
- withTerminationPerInput(Watch.Growth.TerminationCondition<InputT, ?>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Specifies a Watch.Growth.TerminationCondition that will be independently used for every input.
- withTestServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
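For the termination-condition entries above, a hedged sketch that watches a filepattern and stops once no new files appear for an hour; the pattern is hypothetical:

    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.io.fs.MatchResult;
    import org.apache.beam.sdk.transforms.Watch;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    PCollection<MatchResult.Metadata> newFiles =
        pipeline.apply(
            FileIO.match()
                .filepattern("gs://my-bucket/incoming/*.csv")  // hypothetical pattern
                .continuously(
                    Duration.standardSeconds(30),              // polling interval
                    // Terminate after an hour without new output.
                    Watch.Growth.afterTimeSinceNewOutput(Duration.standardHours(1))));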
- withTestServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withTestServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- withThrottleDuration(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the amount of time an attempt will be throttled if deemed necessary based on previous success rate.
- withThrottlingReportTargetMs(int) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Deprecated. This method has been deprecated in Beam 2.60.0. It does not have an effect.
- withThrottlingTargetMs(int) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Deprecated. This method has been deprecated in Beam 2.60.0. It does not have an effect.
- withThrowWriteErrors(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
-
Whether to throw runtime exceptions when write (IO) errors occur.
- withThrowWriteErrors(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withTikaConfigPath(String) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
-
Uses the given Tika Configuration XML file.
- withTikaConfigPath(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
-
Like with(tikaConfigPath).
- withTime(Long) - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
-
Assigns time value to the event metadata.
- withTimeDomain(TimeDomain) - Method in class org.apache.beam.sdk.transforms.Deduplicate.KeyedValues
-
Returns a KeyedValues PTransform like this one but with the specified time domain.
- withTimeDomain(TimeDomain) - Method in class org.apache.beam.sdk.transforms.Deduplicate.Values
-
Returns a Values PTransform like this one but with the specified time domain.
- withTimeDomain(TimeDomain) - Method in class org.apache.beam.sdk.transforms.Deduplicate.WithRepresentativeValues
-
Returns a WithRepresentativeValues PTransform like this one but with the specified time domain.
- withTimeout(int) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
Define the Redis connection timeout.
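For the Deduplicate.withTimeDomain entries above, a minimal sketch that deduplicates values seen within a ten-minute processing-time window:

    import org.apache.beam.sdk.state.TimeDomain;
    import org.apache.beam.sdk.transforms.Deduplicate;
    import org.joda.time.Duration;

    // Assumes `events` is an existing PCollection<String>.
    events.apply(
        Deduplicate.<String>values()
            .withTimeDomain(TimeDomain.PROCESSING_TIME)
            .withDuration(Duration.standardMinutes(10)));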
- withTimeout(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
- withTimeout(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
- withTimeout(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
- withTimeout(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
-
Set the connection timeout for the Redis server connection.
- withTimeout(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
- withTimeout(Duration) - Method in class org.apache.beam.io.requestresponse.RequestResponseIO
-
Overrides the RequestResponseIO.DEFAULT_TIMEOUT expected timeout of all user custom code.
- withTimePartitioning(TimePartitioning) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Allows newly created tables to include a TimePartitioning class.
- withTimePartitioning(ValueProvider<TimePartitioning>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Like BigQueryIO.Write.withTimePartitioning(TimePartitioning) but using a deferred ValueProvider.
- withTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- withTimestamp(Instant) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
-
Sets the timestamp of the element in the PCollection, to be used in order to output WriteSummary to the same window from which the inputDoc originated.
- withTimestamp(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- withTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
When reading from Cloud Pub/Sub where record timestamps are provided as Pub/Sub message attributes, specifies the name of the attribute that contains the timestamp.
- withTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Writes to Pub/Sub and adds each record's timestamp to the published messages in an attribute with the specified name.
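A hedged sketch of withTimestampAttribute on both sides of Pub/Sub; the project, subscription, topic, and attribute name are hypothetical:

    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;

    // Reading: element timestamps come from the named message attribute.
    pipeline.apply(
        PubsubIO.readStrings()
            .fromSubscription("projects/my-project/subscriptions/my-sub")
            .withTimestampAttribute("eventTimestamp"));

    // Writing: each record's timestamp is published in the same attribute.
    output.apply(
        PubsubIO.writeStrings()
            .to("projects/my-project/topics/my-topic")
            .withTimestampAttribute("eventTimestamp"));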
- withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
- withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- withTimestampCombiner(TimestampCombiner) - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Override the default TimestampCombiner, to control the output timestamp of values output from a GroupByKey operation.
- withTimestampCombiner(TimestampCombiner) - Method in class org.apache.beam.sdk.values.WindowingStrategy
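A minimal sketch of Window.withTimestampCombiner, assuming `input` is an existing keyed PCollection:

    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.TimestampCombiner;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.KV;
    import org.joda.time.Duration;

    input.apply(
        Window.<KV<String, Long>>into(FixedWindows.of(Duration.standardMinutes(1)))
            // Downstream GroupByKey outputs carry the end-of-window timestamp.
            .withTimestampCombiner(TimestampCombiner.END_OF_WINDOW));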
- withTimestampFn(SerializableFunction<Long, Instant>) - Method in class org.apache.beam.sdk.io.GenerateSequence
-
Specifies the function to use to assign timestamps to the elements.
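A small sketch of GenerateSequence.withTimestampFn assigning synthetic event times one second apart:

    import org.apache.beam.sdk.io.GenerateSequence;
    import org.joda.time.Instant;

    pipeline.apply(
        GenerateSequence.from(0).to(1000)
            // Element i gets event time i seconds after the epoch.
            .withTimestampFn(i -> new Instant(i * 1000L)));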
- withTimestampFn(SerializableFunction<KinesisRecord, Instant>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
-
Specify the SerializableFunction to extract the event time from a KinesisRecord.
- withTimestampFn(SerializableFunction<KafkaRecord<K, V>, Instant>) - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
-
Deprecated.
- withTimestampFn(SerializableFunction<KV<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Deprecated. As of version 2.4, use KafkaIO.Read.withTimestampPolicyFactory(TimestampPolicyFactory) instead.
- withTimestampFn(SerializableFunction<T, Instant>) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
-
The timestamp function, used for estimating the watermark, mapping the record T to an Instant.
- withTimestampFn(SerializableFunction<V, Instant>) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
-
A function to calculate timestamp for a record.
- withTimestampFn2(SerializableFunction<KafkaRecord<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Deprecated. As of version 2.4, use KafkaIO.Read.withTimestampPolicyFactory(TimestampPolicyFactory) instead.
- withTimestampPolicyFactory(TimestampPolicyFactory<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Provide custom TimestampPolicyFactory to set event times and watermark for each partition.
- WithTimestamps<T> - Class in org.apache.beam.sdk.transforms
-
A PTransform for assigning timestamps to all the elements of a PCollection.
- withTimeZone(DateTimeZone) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
- withTimeZone(DateTimeZone) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
- withTimeZone(DateTimeZone) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
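For the CalendarWindows withTimeZone entries above, a hedged sketch windowing into calendar days in a specific zone:

    import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.joda.time.DateTimeZone;

    // Assumes `input` is an existing PCollection; windows follow New York calendar days.
    input.apply(
        Window.into(
            CalendarWindows.days(1)
                .withTimeZone(DateTimeZone.forID("America/New_York"))));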
- withToDateTime(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
-
Read metric data up to the toDateTime.
- withTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- withTopic(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
-
Specify the JMS topic destination name where to receive messages from.
- withTopic(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
-
Specify the JMS topic destination name where to send messages to.
- withTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets the topic to read from.
- withTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over KafkaIO.WriteRecords.withTopic(String), used to keep the compatibility with old API based on KV type of element.
- withTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Sets the default Kafka topic to write to.
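A minimal KafkaIO.Write sketch for the entries above; the brokers and topic are hypothetical:

    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.common.serialization.StringSerializer;

    // Assumes `records` is an existing PCollection<KV<String, String>>.
    records.apply(
        KafkaIO.<String, String>write()
            .withBootstrapServers("broker:9092")  // hypothetical brokers
            .withTopic("results")                 // hypothetical default topic
            .withKeySerializer(StringSerializer.class)
            .withValueSerializer(StringSerializer.class));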
- withTopic(String) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
-
Set up the MQTT topic pattern.
- withTopic(String) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
- withTopic(String) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Write
- withTopicArn(String) - Method in class org.apache.beam.sdk.io.aws2.sns.SnsIO.Write
-
SNS topic ARN used for publishing to SNS.
- withTopicFn(SerializableFunction<InputT, String>) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
- withTopicNameMapper(SerializableFunction<EventT, String>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
-
Specify the JMS topic destination name where to send messages to dynamically.
- withTopicPartitions(List<TopicPartition>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets a list of partitions to read from.
- withTopicPattern(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Internally sets a Pattern of topics to read from.
- withTopics(List<String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets a list of topics to read from.
- withTopicVerificationLogging(boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withTotalStreamTimeMillis(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the total streaming time (in millis) for this record.
- withTraceSampleProbability(Double) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Deprecated. This configuration has no effect, as tracing is not available.
- withTransaction(PCollectionView<Transaction>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withTransaction(PCollectionView<Transaction>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- withTransactionConfig(ValueProvider<TransactionConfig>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- withTransactionConfig(TransactionConfig) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- withTransactionConfig(TransactionConfig) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- withTrigger(Trigger) - Method in class org.apache.beam.sdk.values.WindowingStrategy
- withTriggeringFrequency(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Choose the frequency at which file writes are triggered.
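A hedged sketch of triggered file loads for an unbounded BigQuery write; the table spec and schema are hypothetical:

    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.joda.time.Duration;

    // Assumes `tableRows` is an unbounded PCollection<TableRow> and `schema` is defined.
    tableRows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")  // hypothetical table spec
            .withSchema(schema)
            .withMethod(BigQueryIO.Write.Method.FILE_LOADS)
            // Start a load job every five minutes across a fixed number of shards.
            .withTriggeringFrequency(Duration.standardMinutes(5))
            .withNumFileShards(100));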
- withTriggeringFrequency(Duration) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.WriteRows
-
Sets the frequency at which data is written to files and a new Snapshot is produced.
- withTrustSelfSignedCerts(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
If Elasticsearch uses SSL/TLS, configures whether to trust self-signed certificates.
- withType(Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns a copy of the Field with the Schema.FieldType set.
- withType(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
-
Returns a Create.TimestampedValues PTransform like this one that uses the given TypeDescriptor<T> to determine the Coder to use to decode each of the objects into a value of type T.
- withType(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.transforms.Create.Values
-
Returns a Create.Values PTransform like this one that uses the given TypeDescriptor<T> to determine the Coder to use to decode each of the objects into a value of type T.
- withType(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.transforms.Create.WindowedValues
-
Returns a Create.WindowedValues PTransform like this one that uses the given TypeDescriptor<T> to determine the Coder to use to decode each of the objects into a value of type T.
- withTypeFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
-
Provide a function to extract the target type from the document allowing for dynamic document routing.
- withTypeFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withTypeHint(Class<?>, Schema.FieldType) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
-
Specifies the field type of arguments.
- withUnwindMapName(String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- withUnwindMapName(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- withUpdateConfiguration(UpdateConfiguration) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
- withUpdateFields(UpdateField...) - Method in class org.apache.beam.sdk.io.mongodb.UpdateConfiguration
-
Sets the configurations for multiple updates.
- withUpdateKey(String) - Method in class org.apache.beam.sdk.io.mongodb.UpdateConfiguration
- withUpperBound(PartitionColumnT) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
- withUpsertScript(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
-
Whether to use scripted updates and what script to use.
- withUpsertScript(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withUpToDateThreshold(Duration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
-
Specifies how late records consumed by this source can be to still be considered on time.
- withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
- withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
- withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
-
Define the location of the MongoDB instances using an URI.
- withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
-
Define the location of the MongoDB instances using an URI.
- withUri(String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
- withUri(String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
- withUrl(String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- withUrl(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets URL of Snowflake server in the following format: jdbc:snowflake://<account_name>.snowflakecomputing.com
- withUrl(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- withUrls(List<String>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- withUrls(ValueProvider<List<String>>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- withUseCorrelationId(boolean) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
-
Toggles deduplication of messages based on the amqp correlation-id property on incoming messages.
- withUsePartialUpdate(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
-
Provide an instruction to control whether partial updates or inserts (default) are issued to Elasticsearch.
- withUsePartialUpdate(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withUserDataMapper(SingleStoreIO.UserDataMapper<T>) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
- withUserDataMapper(SnowflakeIO.UserDataMapper<T>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
User-defined function mapping user data into CSV lines.
- withUsername(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
- withUsername(String) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Sets the username to connect to your database.
- withUsername(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the username for authentication.
- withUsername(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the username used for authentication.
- withUsername(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
If Elasticsearch authentication is enabled, provide the username.
- withUsername(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
-
Define the username to connect to the JMS broker (authenticated).
- withUsername(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
-
Define the username to connect to the JMS broker (authenticated).
- withUsername(String) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
- withUsername(String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- withUsername(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
- withUsername(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
- withUsername(ValueProvider<String>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Sets the username to connect to your database.
- withUsername(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
Specify the username for authentication.
- withUsername(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
Specify the username for authentication.
- withUsername(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
- withUsernamePasswordAuth(String, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets username/password authentication.
- withUsernamePasswordAuth(ValueProvider<String>, ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets username/password authentication.
- withUsesReshuffle(boolean) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
-
Specifies if a Reshuffle should run before file reads occur.
- withUsesReshuffle(boolean) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
-
Specifies if a Reshuffle should run before file reads occur.
- withUseStatefulBatches(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
-
Whether or not to use Stateful Processing to ensure bulk requests have the desired number of entities i.e.
- withUseStatefulBatches(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- withValidate(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
- withValidation() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Enable validation of the PubSub Read.
- withValidation() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Enable validation of the PubSub Write.
- withValidation() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
-
After creation we will validate that PipelineOptions conforms to all the validation criteria from <T>.
- withValidation() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
After creation we will validate that <T> conforms to all the validation criteria.
- withValidationEventHandler(ValidationEventHandler) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
Sets the ValidationEventHandler to use with JAXB.
- withValidationEventHandler(ValidationEventHandler) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
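For the PipelineOptionsFactory.withValidation entries above, a minimal sketch; MyOptions is a hypothetical options interface:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public interface MyOptions extends PipelineOptions {}

    public static void main(String[] args) {
      MyOptions options =
          PipelineOptionsFactory.fromArgs(args)
              .withValidation()  // fail fast if required options are missing or invalid
              .as(MyOptions.class);
      Pipeline pipeline = Pipeline.create(options);
    }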
- withValue(WindowedValue<OldT>, NewT) - Static method in class org.apache.beam.sdk.values.WindowedValues
-
Returns a new WindowedValue that is a copy of this one, but with a different value, which may have a new type NewT.
- withValue(OtherT) - Method in interface org.apache.beam.sdk.values.WindowedValue
-
A WindowedValue with identical metadata to the current one, but with the provided value.
- withValueClass(Class<V>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
-
Sets a value class.
- withValueClass(Class<V>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
-
Sets a value class.
- withValueCoder(Coder<NewT>) - Method in class org.apache.beam.sdk.values.WindowedValues.FullWindowedValueCoder
- withValueCoder(Coder<NewT>) - Method in class org.apache.beam.sdk.values.WindowedValues.ParamWindowedValueCoder
- withValueCoder(Coder<NewT>) - Method in class org.apache.beam.sdk.values.WindowedValues.ValueOnlyWindowedValueCoder
-
Deprecated.
- withValueCoder(Coder<NewT>) - Method in class org.apache.beam.sdk.values.WindowedValues.WindowedValueCoder
-
Returns a new WindowedValueCoder that is a copy of this one, but with a different value coder.
- withValueDeserializer(Class<? extends Deserializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets a Kafka Deserializer to interpret value bytes read from Kafka.
- withValueDeserializer(Class<? extends Deserializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Sets a Kafka Deserializer to interpret value bytes read from Kafka.
- withValueDeserializer(DeserializerProvider<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withValueDeserializerAndCoder(Class<? extends Deserializer<V>>, Coder<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets a Kafka Deserializer for interpreting value bytes read from Kafka along with a Coder for helping the Beam runner materialize value objects at runtime if necessary.
- withValueDeserializerAndCoder(Class<? extends Deserializer<V>>, Coder<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Sets a Kafka Deserializer for interpreting value bytes read from Kafka along with a Coder for helping the Beam runner materialize value objects at runtime if necessary.
- withValueDeserializerProvider(DeserializerProvider<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- withValueDeserializerProviderAndCoder(DeserializerProvider<V>, Coder<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- withValueDeserializerProviderAndCoder(DeserializerProvider<V>, Coder<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withValueField(String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
- withValueMapper(SerializableBiFunction<EventT, Session, Message>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
-
Map the EventT object to a Message.
- withValueSerializer(Class<? extends Serializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over KafkaIO.WriteRecords.withValueSerializer(Class), used to keep the compatibility with old API based on KV type of element.
- withValueSerializer(Class<? extends Serializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Sets a Serializer for serializing value to bytes.
- withValueTranslation(SimpleFunction<?, V>) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
Transforms the values read from the source using the given value translation function.
- withValueTranslation(SimpleFunction<?, V>, Coder<V>) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
Transforms the values read from the source using the given value translation function.
- withWallTimeWatermarkEstimator() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Use the WatermarkEstimators.WallTime as the watermark estimator.
- withWarehouse(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets Snowflake Warehouse to use.
- withWarehouse(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
Sets Snowflake Warehouse to use.
- withWatermark(Instant) - Method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
-
Returns a new Watch.Growth.PollResult like this one with the provided watermark.
- withWatermarkFn(SerializableFunction<KV<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Deprecated. As of version 2.4, use KafkaIO.Read.withTimestampPolicyFactory(TimestampPolicyFactory) instead.
- withWatermarkFn2(SerializableFunction<KafkaRecord<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Deprecated. As of version 2.4, use KafkaIO.Read.withTimestampPolicyFactory(TimestampPolicyFactory) instead.
- withWatermarkIdleDurationThreshold(Duration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
-
Specify the watermark idle duration to consider before advancing the watermark.
- withWatermarkIdleDurationThreshold(Duration) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
-
Optional.
- withWatermarkRefreshRate(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
- withWindowCoder(Coder<? extends BoundedWindow>) - Method in class org.apache.beam.sdk.transforms.Create.WindowedValues
-
Returns a Create.WindowedValues PTransform like this one that uses the given Coder<T> to decode each of the objects into a value of type T.
- withWindowedWrites() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
-
Preserves windowing of input elements and writes them to files based on the element's window.
- withWindowedWrites() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
- withWindowedWrites() - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
-
Preserves windowing of input elements and writes them to files based on the element's window.
- withWindowedWrites() - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
-
Specify that writes are windowed.
- withWindowedWrites() - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
-
Preserves windowing of input elements and writes them to files based on the element's window.
- withWindowedWrites() - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Preserves windowing of input elements and writes them to files based on the element's window.
- withWindowedWrites() - Method in class org.apache.beam.sdk.io.TextIO.Write
- withWindowedWrites() - Method in class org.apache.beam.sdk.io.WriteFiles
-
Returns a new WriteFiles that preserves windowing on its input.
- withWindowFn(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.values.WindowingStrategy
- withWritableByteChannelFactory(FileBasedSink.WritableByteChannelFactory) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
-
Returns a transform for writing to text files like this one but that has the given FileBasedSink.WritableByteChannelFactory to be used by the FileBasedSink during output.
- withWritableByteChannelFactory(FileBasedSink.WritableByteChannelFactory) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
-
Returns a transform for writing to text files like this one but that has the given FileBasedSink.WritableByteChannelFactory to be used by the FileBasedSink during output.
- withWritableByteChannelFactory(FileBasedSink.WritableByteChannelFactory) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
Returns a transform for writing to text files like this one but that has the given FileBasedSink.WritableByteChannelFactory to be used by the FileBasedSink during output.
- withWritableByteChannelFactory(FileBasedSink.WritableByteChannelFactory) - Method in class org.apache.beam.sdk.io.TextIO.Write
- withWriteDisposition(BigQueryIO.Write.WriteDisposition) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Specifies what to do with existing data in the table, in case the table already exists.
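A small sketch combining write and create dispositions on a BigQuery write; the table spec and schema are hypothetical:

    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition;

    // Assumes `rows` is an existing PCollection<TableRow> and `schema` is defined.
    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.daily_table")  // hypothetical table spec
            .withSchema(schema)
            .withCreateDisposition(CreateDisposition.CREATE_IF_NEEDED)
            .withWriteDisposition(WriteDisposition.WRITE_TRUNCATE));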
- withWriteDisposition(WriteDisposition) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
-
A disposition to be used during the write-to-table phase.
- withWriteRequestMapperFn(SerializableFunction<T, KV<String, WriteRequest>>) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Write
- withWriteResults() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a BigtableIO.WriteWithResults that will emit a BigtableWriteResult for each batch of rows written.
- withWriteResults(JdbcIO.RowMapper<V>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
-
Returns a JdbcIO.WriteWithResults transform that can return a specific result.
- withWriterType(SolaceIO.WriterType) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
-
Set the type of writer used by the connector.
- withWriteTempDataset(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Temporary dataset.
- withWriteTransaction() - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
- witValueField(String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Set the name of the value field in the resulting schema.
- WordCount - Class in org.apache.beam.runners.spark.structuredstreaming.examples
-
Duplicated from beam-examples-java to avoid dependency.
- WordCount() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount
- WordCount.CountWords - Class in org.apache.beam.runners.spark.structuredstreaming.examples
-
A PTransform that converts a PCollection containing lines of text into a PCollection of formatted word counts.
- WordCount.FormatAsTextFn - Class in org.apache.beam.runners.spark.structuredstreaming.examples
-
A SimpleFunction that converts a Word and Count into a printable string.
- WordCount.WordCountOptions - Interface in org.apache.beam.runners.spark.structuredstreaming.examples
-
Options supported by WordCount.
- Workarounds - Class in org.apache.beam.runners.flink.translation.utils
-
Workarounds for dealing with limitations of Flink or its libraries.
- Workarounds() - Constructor for class org.apache.beam.runners.flink.translation.utils.Workarounds
- WorkerLogLevelOverrides() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Deprecated.
- workerStatus(StreamObserver<BeamFnApi.WorkerStatusRequest>) - Method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
- WorkItemKeySelector<K, V> - Class in org.apache.beam.runners.flink.translation.wrappers.streaming
-
KeySelector that retrieves a key from a KeyedWorkItem.
- WorkItemKeySelector(Coder<K>) - Constructor for class org.apache.beam.runners.flink.translation.wrappers.streaming.WorkItemKeySelector
- wrap(String) - Method in class org.apache.beam.sdk.testing.PAssert.PAssertionSite
- wrap(Throwable) - Method in class org.apache.beam.sdk.testing.PAssert.PAssertionSite
- wrapDescriptorProto(DescriptorProtos.DescriptorProto) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- WrappedList(List<Object>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedList
- WrappedMap(Map<Object, Object>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedMap
- WrappedRow(Row) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedRow
- WrappedSupervisor - Class in org.apache.beam.sdk.io.sparkreceiver
-
Wrapper class for ReceiverSupervisor that doesn't use Spark Environment.
- WrappedSupervisor(Receiver<?>, SparkConf, SerializableFunction<Object[], Void>) - Constructor for class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
- wrapping(StreamObserver<V>) - Static method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
-
Create a new SynchronizedStreamObserver which will delegate all calls to the underlying StreamObserver, synchronizing access to that observer.
- wrapProcessContext(DoFn.ProcessContext) - Static method in class org.apache.beam.sdk.transforms.Contextful.Fn.Context
-
Convenience wrapper for creating a Contextful.Fn.Context from a DoFn.ProcessContext, to support the common case when a PTransform is invoking the closure from inside a DoFn.
- wrapSegment(String) - Static method in class org.apache.beam.sdk.metrics.Lineage
-
Wraps the given segment into a valid segment name.
- WritableCoder<T> - Class in org.apache.beam.sdk.io.hadoop
- WritableCoder(Class<T>) - Constructor for class org.apache.beam.sdk.io.hadoop.WritableCoder
- WritableCoder.WritableCoderProviderRegistrar - Class in org.apache.beam.sdk.io.hadoop
- WritableCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.hadoop.WritableCoder.WritableCoderProviderRegistrar
- write() - Static method in class org.apache.beam.sdk.io.amqp.AmqpIO
- write() - Static method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO
- write() - Static method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
-
Returns a new KinesisIO.Write transform for writing to Kinesis.
- write() - Static method in class org.apache.beam.sdk.io.aws2.sns.SnsIO
- write() - Static method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO
-
Deprecated. Use SqsIO.writeBatches() for more configuration options.
- write() - Static method in class org.apache.beam.sdk.io.cassandra.CassandraIO
-
Provide a CassandraIO.Write PTransform to write data to a Cassandra database.
- write() - Static method in class org.apache.beam.sdk.io.cdap.CdapIO
- write() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
- write() - Static method in class org.apache.beam.sdk.io.FileIO
-
Writes elements to files using a FileIO.Sink.
- write() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
A PTransform that writes a PCollection to a BigQuery table.
- write() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
-
Creates an uninitialized BigtableIO.Write.
- write() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
-
Returns an empty DatastoreV1.Write builder.
- write() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1
-
The class returned by this method provides the ability to create PTransforms for write operations available in the Firestore V1 API provided by FirestoreStub.
- write() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
-
Creates an uninitialized instance of SpannerIO.Write.
- write() - Static method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
-
Creates an HadoopFormatIO.Write.Builder for creating a Write transformation.
- write() - Static method in class org.apache.beam.sdk.io.hbase.HBaseIO
-
Creates an uninitialized HBaseIO.Write.
- write() - Static method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO
-
Write data to Hive.
- write() - Static method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO
- write() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
-
Write data to a JDBC datasource.
- write() - Static method in class org.apache.beam.sdk.io.jms.JmsIO
- write() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
-
Creates an uninitialized KafkaIO.Write PTransform.
- write() - Static method in class org.apache.beam.sdk.io.kudu.KuduIO
- write() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
-
Write data to GridFS.
- write() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbIO
-
Write data to MongoDB.
- write() - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO
- write() - Static method in class org.apache.beam.sdk.io.pulsar.PulsarIO
-
Write to Apache Pulsar.
- write() - Static method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO
- write() - Static method in class org.apache.beam.sdk.io.redis.RedisIO
-
Write data to a Redis server.
- write() - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
-
Write data to a SingleStoreDB datasource.
- write() - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO
-
Write data to Snowflake via COPY statement.
- write() - Static method in class org.apache.beam.sdk.io.solace.SolaceIO
-
Create a SolaceIO.Write transform, to write to Solace using Solace.Record objects.
- write() - Static method in class org.apache.beam.sdk.io.solr.SolrIO
- write() - Static method in class org.apache.beam.sdk.io.TextIO
-
A PTransform that writes a PCollection to a text file (or multiple text files matching a sharding pattern), with each element of the input collection encoded into its own line.
- write() - Static method in class org.apache.beam.sdk.io.TFRecordIO
-
A PTransform that writes a PCollection to a TFRecord file (or multiple TFRecord files matching a sharding pattern), with each element of the input collection encoded into its own record.
- write() - Static method in class org.apache.beam.sdk.io.xml.XmlIO
-
Writes all elements in the input PCollection to a single XML file using XmlIO.sink(java.lang.Class<T>).
- write(byte[]) - Method in class org.apache.beam.sdk.io.TFRecordIO.Sink
- write(byte[], int, int) - Method in class org.apache.beam.runners.flink.translation.wrappers.DataOutputViewWrapper
- write(byte[], int, int) - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
- write(int) - Method in class org.apache.beam.runners.flink.translation.wrappers.DataOutputViewWrapper
- write(int) - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
- write(Kryo, Output) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues.BaseSideInputValues
- write(Kryo, Output, ValueAndCoderLazySerializable<T>) - Method in class org.apache.beam.runners.spark.translation.ValueAndCoderKryoSerializer
- write(ElementT) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
- write(ElementT) - Method in interface org.apache.beam.sdk.io.FileIO.Sink
-
Appends a single element to the file.
- write(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Writes a PCollection to an Avro file (or multiple Avro files matching a sharding pattern).
- write(Object, Encoder) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder.TimestampEncoding
-
Serializes a Timestamp received as datum to the output encoder out.
- write(String) - Static method in class org.apache.beam.sdk.io.json.JsonIO
-
Instantiates a JsonIO.Write for writing user types in JSONFormat format.
- write(String) - Method in class org.apache.beam.sdk.io.TextIO.Sink
- write(String) - Static method in class org.apache.beam.sdk.managed.Managed
-
Instantiates a Managed.ManagedTransform transform for the specified sink.
- write(String, String) - Static method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
- write(String, String) - Static method in class org.apache.beam.sdk.io.splunk.SplunkIO
-
Write to Splunk's Http Event Collector (HEC).
- write(String, CSVFormat) - Static method in class org.apache.beam.sdk.io.csv.CsvIO
-
Instantiates a CsvIO.Write for writing user types in CSVFormat format.
- write(ByteBuffer) - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
- write(ByteBuffer) - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingWritableByteChannel
- write(GenericRecord) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
- write(PublisherOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
-
Write messages to Pub/Sub Lite.
- write(MongoDbGridFSIO.WriteFn<T>) - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
- write(SnowflakeBatchServiceConfig) - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceImpl
-
Writing data to Snowflake in batch mode.
- write(SnowflakeBatchServiceConfig) - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices.BatchService
- write(SnowflakeStreamingServiceConfig) - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices.StreamingService
- write(SnowflakeStreamingServiceConfig) - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceImpl
-
Writing data to Snowflake in streaming mode.
- write(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.splunk.SplunkIO
-
Same as SplunkIO.write(String, String) but with ValueProvider.
- write(SerializableFunction<T, Solace.Record>) - Static method in class org.apache.beam.sdk.io.solace.SolaceIO
-
Create a SolaceIO.Write transform, to write to Solace with a custom type.
- write(DataOutputView) - Method in class org.apache.beam.runners.flink.adapter.FlinkKey
- write(OutputT) - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Called for each value in the bundle.
- write(T) - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.Sink
- write(T) - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.ThriftWriter
- write(T) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
- write(T) - Method in interface org.apache.beam.sdk.state.ValueState
-
Set the value.
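For ValueState.write, a hedged sketch of a stateful DoFn keeping a running total per key:

    import org.apache.beam.sdk.coders.VarLongCoder;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.state.ValueState;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    static class RunningTotalFn extends DoFn<KV<String, Long>, KV<String, Long>> {
      @StateId("total")
      private final StateSpec<ValueState<Long>> totalSpec = StateSpecs.value(VarLongCoder.of());

      @ProcessElement
      public void process(
          @Element KV<String, Long> element,
          @StateId("total") ValueState<Long> total,
          OutputReceiver<KV<String, Long>> out) {
        Long current = total.read();  // null for the first element of a key
        long updated = (current == null ? 0L : current) + element.getValue();
        total.write(updated);         // ValueState.write stores the new value
        out.output(KV.of(element.getKey(), updated));
      }
    }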
- write(T, OutputStream) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.WriteFn
-
Output the object to the given OutputStream.
- Write - Search tag in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO - Section
- Write() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.aws2.sns.SnsIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Write
-
Deprecated.
- Write() - Constructor for class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.cdap.CdapIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.csv.CsvIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.FileIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.jms.JmsIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.json.JsonIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.kudu.KuduIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.pulsar.PulsarIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.redis.RedisIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.solace.SolaceIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.splunk.SplunkIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.TFRecordIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO.Write
- WRITE - Enum constant in enum class org.apache.beam.sdk.io.cassandra.CassandraIO.MutationType
- WRITE - Enum constant in enum class org.apache.beam.sdk.jmh.schemas.RowBundle.Action
-
Write field to object using GetterBasedSchemaProvider.fromRowFunction(TypeDescriptor).
- WRITE_APPEND - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
-
Specifies that rows may be appended to an existing table.
- WRITE_EMPTY - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
-
Specifies that the output table must be empty.
- WRITE_TRANSFORMS - Static variable in class org.apache.beam.sdk.managed.Managed
- WRITE_TRUNCATE - Enum constant in enum class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
-
Specifies that write should replace a table.
- WRITE_URN - Static variable in class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar
- WRITE_URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
- WRITE_URN - Static variable in class org.apache.beam.sdk.io.snowflake.crosslanguage.SnowflakeTransformRegistrar
- writeArtifacts(RunnerApi.Pipeline, String) - Method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarCreator
-
Stages all dependencies in pipeline into the jar file at outputStream, returning a new pipeline that references these artifacts as classpath artifacts.
- writeAvros(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that writes binary encoded Avro messages of a given type to a Google Cloud Pub/Sub stream.
- writeAvros(Class<T>, SerializableFunction<ValueInSingleWindow<T>, Map<String, String>>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that writes binary encoded Avro messages of a given type to a Google Cloud Pub/Sub stream.
- writeBatches() - Static method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO
- WriteBatches() - Constructor for class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
- WriteBuilder - Class in org.apache.beam.sdk.io.snowflake.crosslanguage
- WriteBuilder() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisTransformRegistrar.WriteBuilder
- WriteBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.WriteBuilder
- WriteBuilder() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder
- WriteBuilder.Configuration - Class in org.apache.beam.sdk.io.snowflake.crosslanguage
-
Parameters class to expose the transform to an external SDK.
- writeCallMetric(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- writeCompressed(WritableByteChannel) - Method in enum class org.apache.beam.sdk.io.Compression
- writeCustomType() - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Deprecated. Use AvroIO.writeCustomType(Class)
instead and provide the custom record class - writeCustomType() - Static method in class org.apache.beam.sdk.io.TextIO
-
A
PTransform
that writes aPCollection
to a text file (or multiple text files matching a sharding pattern), with each element of the input collection encoded into its own line.
- writeCustomType(Class<OutputT>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
A PTransform that writes a PCollection to an avro file (or multiple avro files matching a sharding pattern), with each element of the input collection encoded into its own record of type OutputT.
- writeCustomTypeToGenericRecords() - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Similar to AvroIO.writeCustomType(), but specialized for the case where the output type is GenericRecord.
- writeDefaultJobName(JarOutputStream, String) - Static method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
- writeDetectNewPartitionMissingPartitions(HashMap<Range.ByteStringRange, Instant>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Write serialized missing partitions, and how long they have been missing, to the metadata table.
- writeDetectNewPartitionVersion() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
Set the version number for DetectNewPartition.
- WriteDisposition - Enum Class in org.apache.beam.sdk.io.snowflake.enums
-
Enum containing all dispositions supported during the write-to-table phase.
- writeDynamic() - Static method in class org.apache.beam.sdk.io.FileIO
-
Writes elements to files using a FileIO.Sink and grouping the elements using "dynamic destinations".
- WriteErrorMetrics(String) - Constructor for class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler.WriteErrorMetrics
- writeExternal(ObjectOutput) - Method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
- WriteFailure(Write, WriteResult, Status) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- WriteFiles<UserT, DestinationT, OutputT> - Class in org.apache.beam.sdk.io
-
A PTransform that writes to a FileBasedSink.
- WriteFiles() - Constructor for class org.apache.beam.sdk.io.WriteFiles
- WriteFilesResult<DestinationT> - Class in org.apache.beam.sdk.io
-
The result of a WriteFiles transform.
- writeFooter() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Writes footer at the end of output files.
- writeGenericRecords() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
- writeGenericRecords(String) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Writes Avro records of the specified schema.
- writeGenericRecords(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
-
Writes Avro records of the specified schema.
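For illustration, a minimal sketch of writing GenericRecords with an explicit schema; the schema JSON, output path, and input PCollection are assumed placeholders:

    import org.apache.avro.Schema;
    import org.apache.beam.sdk.extensions.avro.io.AvroIO;

    // 'records' is an assumed PCollection<GenericRecord> conforming to 'schema'.
    Schema schema = new Schema.Parser().parse(schemaJson);  // 'schemaJson' assumed
    records.apply(
        AvroIO.writeGenericRecords(schema)
            .to("gs://my-bucket/output/records")  // illustrative path prefix
            .withSuffix(".avro"));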
- WriteGrouped(SpannerIO.Write) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
- writeHeader() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Writes header at the beginning of output files.
- WriteJmsResult<EventT> - Class in org.apache.beam.sdk.io.jms
-
Return type of JmsIO.Write transform.
- WriteJmsResult(Pipeline, TupleTag<EventT>, PCollection<EventT>) - Constructor for class org.apache.beam.sdk.io.jms.WriteJmsResult
- writeMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that writes to a Google Cloud Pub/Sub stream.
- writeMessagesDynamic() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Enables dynamic destination topics.
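For illustration, a minimal sketch, assuming each element already carries its own destination topic (e.g. set via PubsubMessage#withTopic in recent Beam releases):

    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;

    // 'messages' is an assumed PCollection<PubsubMessage> whose elements each
    // name their own topic; no topic is configured on the transform itself.
    messages.apply(PubsubIO.writeMessagesDynamic());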
- writeMetrics(MetricQueryResults) - Method in interface org.apache.beam.sdk.metrics.MetricsSink
- writeNewPartition(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
-
After a split or merge from a close stream, write the new partition's information to the metadata table.
- writeObject(FileSystem, Path, Object) - Static method in class org.apache.beam.runners.spark.translation.streaming.Checkpoint
- WriteOperation(FileBasedSink<?, DestinationT, OutputT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
Constructs a WriteOperation using the default strategy for generating a temporary directory from the base output filename.
- WriteOperation(FileBasedSink<?, DestinationT, OutputT>, ResourceId) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
Create a new WriteOperation.
- writeProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
A PTransform that writes a PCollection containing protocol buffer objects to a BigQuery table.
- writeProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that writes binary encoded protobuf messages of a given type to a Google Cloud Pub/Sub stream.
- writeProtos(Class<T>, SerializableFunction<ValueInSingleWindow<T>, Map<String, String>>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that writes binary encoded protobuf messages of a given type to a Google Cloud Pub/Sub stream.
- Writer(FileBasedSink.WriteOperation<DestinationT, OutputT>, String) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Construct a new FileBasedSink.Writer that will produce files of the given MIME type.
- writeRecords() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
-
Creates an uninitialized KafkaIO.WriteRecords PTransform.
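For illustration, a minimal sketch of configuring the returned transform; the broker address, topic, and input PCollection are assumed placeholders:

    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.common.serialization.LongSerializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    // 'producerRecords' is an assumed PCollection<ProducerRecord<Long, String>>.
    producerRecords.apply(
        KafkaIO.<Long, String>writeRecords()
            .withBootstrapServers("broker-1:9092")  // illustrative broker
            .withTopic("results")                   // illustrative default topic
            .withKeySerializer(LongSerializer.class)
            .withValueSerializer(StringSerializer.class));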
- WriteRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation.WriteRegistrar
- WriteRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.WriteRegistrar
- WriteRegistrar() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergSchemaTransformTranslation.WriteRegistrar
- WriteRegistrar() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation.WriteRegistrar
- WriteRegistrar() - Constructor for class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation.WriteRegistrar
- WriteResult - Class in org.apache.beam.sdk.io.gcp.bigquery
-
The result of a BigQueryIO.Write transform.
- writeRowMutations() - Static method in class org.apache.beam.sdk.io.hbase.HBaseIO
- writeRows() - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
-
Write Beam Rows to a SingleStoreDB datasource.
- writeRows(String) - Static method in class org.apache.beam.sdk.io.json.JsonIO
- writeRows(String, CSVFormat) - Static method in class org.apache.beam.sdk.io.csv.CsvIO
- writeRows(IcebergCatalogConfig) - Static method in class org.apache.beam.sdk.io.iceberg.IcebergIO
- WriteRows() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergIO.WriteRows
- writeSnapshot(DataOutputView) - Method in class org.apache.beam.runners.flink.translation.types.UnversionedTypeSerializerSnapshot
- writeSnapshot(DataOutputView) - Method in class org.apache.beam.runners.flink.translation.wrappers.streaming.state.FlinkStateInternals.FlinkStateNamespaceKeySerializer.FlinkStateNameSpaceSerializerSnapshot
- writeStatement(Schema) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransform
- writeStreams() - Static method in class org.apache.beam.sdk.io.redis.RedisIO
-
Write stream data to a Redis server.
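For illustration, a minimal sketch, assuming WriteStreams exposes withEndpoint like other RedisIO transforms; the host, port, and input PCollection are placeholders:

    import org.apache.beam.sdk.io.redis.RedisIO;

    // 'streamEntries' is an assumed PCollection<KV<String, Map<String, String>>>
    // pairing a Redis stream key with one entry's field/value map.
    streamEntries.apply(RedisIO.writeStreams().withEndpoint("localhost", 6379));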
- WriteStreams() - Constructor for class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
- WriteStreamServiceImpl(BigQueryOptions) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
- writeStrings() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a PTransform that writes UTF-8 encoded strings to a Google Cloud Pub/Sub stream.
- WriteSuccessSummary(int, long) - Constructor for class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.WriteSuccessSummary
- WriteSuccessSummary(int, long) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- writeTableRows() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
- writeTimeout() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Milliseconds to wait for a write on a socket before an exception is thrown.
- writeTimeout(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Milliseconds to wait for a write on a socket before an exception is thrown.
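For illustration, a minimal sketch of building a configuration with this setter; how the configuration is attached to a particular AWS IO is omitted:

    import org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration;

    // Fail socket writes that stall for more than 60 seconds.
    HttpClientConfiguration http =
        HttpClientConfiguration.builder().writeTimeout(60_000).build();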
- writeTo(OutputStream, int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Writes length bytes starting at offset from the backing data store to the specified output stream.
- Write to Cdap Plugin Bounded Sink - Search tag in class org.apache.beam.sdk.io.cdap.CdapIO
- Section
- WriteToMySqlSchemaTransformProvider - Class in org.apache.beam.sdk.io.jdbc.providers
- WriteToMySqlSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.jdbc.providers.WriteToMySqlSchemaTransformProvider
- WriteToOracleSchemaTransformProvider - Class in org.apache.beam.sdk.io.jdbc.providers
- WriteToOracleSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.jdbc.providers.WriteToOracleSchemaTransformProvider
- writeToPort(String, BeamFnApi.RemoteGrpcPort) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
-
Create a RemoteGrpcPortWrite which writes the RunnerApi.PCollection with the provided Pipeline id to the provided BeamFnApi.RemoteGrpcPort.
- WriteToPostgresSchemaTransformProvider - Class in org.apache.beam.sdk.io.jdbc.providers
- WriteToPostgresSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.jdbc.providers.WriteToPostgresSchemaTransformProvider
- WriteToPulsarDoFn - Class in org.apache.beam.sdk.io.pulsar
-
Transform for writing to Apache Pulsar.
- WriteToSqlServerSchemaTransformProvider - Class in org.apache.beam.sdk.io.jdbc.providers
- WriteToSqlServerSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.jdbc.providers.WriteToSqlServerSchemaTransformProvider
- writeUnwind() - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO
-
Write all rows using a Neo4j Cypher UNWIND statement.
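For illustration, a minimal sketch, assuming the usual WriteUnwind configuration methods; the URI, credentials, Cypher text, and element type MyRow are hypothetical:

    import java.util.Map;
    import org.apache.beam.sdk.io.neo4j.Neo4jIO;

    // 'rows' is an assumed PCollection<MyRow>; each element becomes one map in
    // the $rows list that the UNWIND statement iterates over.
    rows.apply(
        Neo4jIO.<MyRow>writeUnwind()
            .withDriverConfiguration(
                Neo4jIO.DriverConfiguration.create("neo4j://localhost:7687", "neo4j", "secret"))
            .withUnwindMapName("rows")
            .withCypher("UNWIND $rows AS row MERGE (n:Item {id: row.id})")
            .withParametersFunction(r -> Map.<String, Object>of("id", r.getId())));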
- WriteUnwind() - Constructor for class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
- writeUserEvent() - Static method in class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
- writeVoid() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
- WriteVoid() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
- writeWithKeyNormalization(byte[], DataOutputView) - Method in class org.apache.beam.runners.flink.translation.types.EncodedValueComparator
- WriteWithResults() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
- Writing - Search tag in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
- Section
- Writing - Search tag in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
- Section
- Writing - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- Writing Avro files - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroIO
- Section
- Writing CSV files - Search tag in class org.apache.beam.sdk.io.csv.CsvIO
- Section
- Writing custom types to sinks - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Writing data to multiple destinations - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroIO
- Section
- Writing different values to different tables - Search tag in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
- Section
- Writing files - Search tag in class org.apache.beam.sdk.io.FileIO
- Section
- Writing JSON files - Search tag in class org.apache.beam.sdk.io.json.JsonIO
- Section
- Writing Mutation - Search tag in class org.apache.beam.sdk.io.hbase.HBaseIO
- Section
- Writing Parquet files - Search tag in class org.apache.beam.sdk.io.parquet.ParquetIO
- Section
- Writing Redis key/value pairs - Search tag in class org.apache.beam.sdk.io.redis.RedisIO
- Section
- Writing Redis Streams - Search tag in class org.apache.beam.sdk.io.redis.RedisIO
- Section
- Writing RowMutations - Search tag in class org.apache.beam.sdk.io.hbase.HBaseIO
- Section
- Writing specific or generic records - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroIO
- Section
- Writing text files - Search tag in class org.apache.beam.sdk.io.TextIO
- Section
- Writing Thrift Files - Search tag in class org.apache.beam.sdk.io.thrift.ThriftIO
- Section
- Writing to a JMS destination - Search tag in class org.apache.beam.sdk.io.jms.JmsIO
- Section
- Writing to a MQTT broker - Search tag in class org.apache.beam.sdk.io.mqtt.MqttIO
- Section
- Writing to Apache Cassandra - Search tag in class org.apache.beam.sdk.io.cassandra.CassandraIO
- Section
- Writing to a static topic or queue - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- Writing to ClickHouse - Search tag in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
- Section
- Writing to Cloud Bigtable - Search tag in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
- Section
- Writing to Cloud Spanner - Search tag in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
- Section
- Writing to dynamic destinations - Search tag in class org.apache.beam.sdk.io.solace.SolaceIO
- Section
- Writing to DynamoDB - Search tag in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO
- Section
- Writing to Elasticsearch - Search tag in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
- Section
- Writing to HBase - Search tag in class org.apache.beam.sdk.io.hbase.HBaseIO
- Section
- Writing to InfluxDB - Search tag in class org.apache.beam.sdk.io.influxdb.InfluxDbIO
- Section
- Writing to JDBC datasource - Search tag in class org.apache.beam.sdk.io.jdbc.JdbcIO
- Section
- Writing to Kafka - Search tag in class org.apache.beam.sdk.io.kafka.KafkaIO
- Section
- Writing to Kinesis - Search tag in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
- Section
- Writing to Kudu - Search tag in class org.apache.beam.sdk.io.kudu.KuduIO
- Section
- Writing to MongoDB - Search tag in class org.apache.beam.sdk.io.mongodb.MongoDbIO
- Section
- Writing to MongoDB via GridFS - Search tag in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
- Section
- Writing to Neo4j - Search tag in class org.apache.beam.sdk.io.neo4j.Neo4jIO
- Section
- Writing to SingleStoreDB datasource - Search tag in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
- Section
- Writing to Snowflake - Search tag in class org.apache.beam.sdk.io.snowflake.SnowflakeIO
- Section
- Writing to SNS - Search tag in class org.apache.beam.sdk.io.aws2.sns.SnsIO
- Section
- Writing to Solr - Search tag in class org.apache.beam.sdk.io.solr.SolrIO
- Section
- Writing to Splunk's HEC - Search tag in class org.apache.beam.sdk.io.splunk.SplunkIO
- Section
- Writing to SQS - Search tag in class org.apache.beam.sdk.io.aws2.sqs.SqsIO
- Section
- Writing to Tables - Search tag in class org.apache.beam.sdk.io.iceberg.IcebergIO
- Section
- Writing using Hadoop HadoopFormatIO - Search tag in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
- Section
- Writing using HCatalog - Search tag in class org.apache.beam.sdk.io.hcatalog.HCatalogIO
- Section
- Writing windowed or unbounded data - Search tag in class org.apache.beam.sdk.extensions.avro.io.AvroIO
- Section
- WS - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- WS - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
X
- XmlConfiguration() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration
- xmlConfigurationBuilder() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
- XmlIO - Class in org.apache.beam.sdk.io.xml
-
Transforms for reading and writing XML files using JAXB mappers.
- XmlIO() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO
- XmlIO.Read<T> - Class in org.apache.beam.sdk.io.xml
-
Implementation of XmlIO.read().
- XmlIO.Read.CompressionType - Enum Class in org.apache.beam.sdk.io.xml
-
Deprecated. Use Compression instead.
- XmlIO.ReadFiles<T> - Class in org.apache.beam.sdk.io.xml
-
Implementation of XmlIO.readFiles().
- XmlIO.Sink<T> - Class in org.apache.beam.sdk.io.xml
-
Implementation of XmlIO.sink(java.lang.Class<T>).
- XmlIO.Write<T> - Class in org.apache.beam.sdk.io.xml
-
Implementation of XmlIO.write().
- XmlSource<T> - Class in org.apache.beam.sdk.io.xml
-
Implementation of XmlIO.read().
- XmlWriteSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
-
A FileWriteSchemaTransformFormatProvider for XML format.
- XmlWriteSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.XmlWriteSchemaTransformFormatProvider
Y
- yamlStringFromMap(Map<String, Object>) - Static method in class org.apache.beam.sdk.schemas.utils.YamlUtils
- yamlStringToMap(String) - Static method in class org.apache.beam.sdk.schemas.utils.YamlUtils
- YamlTransform<InputT, OutputT> - Class in org.apache.beam.sdk.extensions.yaml
-
Allows one to invoke Beam YAML transforms from Java.
- YamlUtils - Class in org.apache.beam.sdk.schemas.utils
- YamlUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.YamlUtils
- years(int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
-
Returns a
WindowFn
that windows elements into periods measured by years.
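For illustration, a minimal sketch; 'events' is an assumed timestamped PCollection:

    import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;

    // Assign each element to the calendar year containing its timestamp.
    events.apply(Window.into(CalendarWindows.years(1)));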
Z
- ZERO_CURSOR - Static variable in class org.apache.beam.sdk.io.redis.RedisCursor
- ZERO_KEY - Static variable in class org.apache.beam.sdk.io.redis.RedisCursor
- ZETASQL_FUNCTION_GROUP_NAME - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCatalog
-
Same as Function.ZETASQL_FUNCTION_GROUP_NAME.
- ZETASQL_NUMERIC_MAX_VALUE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlCalciteTranslationUtils
- ZETASQL_NUMERIC_MIN_VALUE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlCalciteTranslationUtils
- ZETASQL_NUMERIC_SCALE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlCalciteTranslationUtils
- ZETASQL_TIMESTAMP_ADD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
- ZetaSqlBeamTranslationUtils - Class in org.apache.beam.sdk.extensions.sql.zetasql
-
Utility methods for ZetaSQL <=> Beam translation.
- ZetaSqlCalciteTranslationUtils - Class in org.apache.beam.sdk.extensions.sql.zetasql
-
Utility methods for ZetaSQL <=> Calcite translation.
- ZetaSqlException - Exception Class in org.apache.beam.sdk.extensions.sql.zetasql
-
Exception to be thrown by the Beam ZetaSQL planner.
- ZetaSqlException(StatusRuntimeException) - Constructor for exception class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlException
- ZetaSqlException(String) - Constructor for exception class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlException
- ZetaSQLQueryPlanner - Class in org.apache.beam.sdk.extensions.sql.zetasql
-
ZetaSQLQueryPlanner.
- ZetaSQLQueryPlanner(JdbcConnection, Collection<RuleSet>) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
-
Called by BeamSqlEnv.instantiatePlanner() reflectively.
- ZetaSQLQueryPlanner(FrameworkConfig) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
- ZetaSqlScalarFunctionImpl - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
-
ZetaSQL-specific extension to ScalarFunctionImpl.
- ZetaSqlUnnest - Class in org.apache.beam.sdk.extensions.sql.zetasql.unnest
-
This class is a copy of Uncollect.java in Calcite: https://github.com/apache/calcite/blob/calcite-1.20.0/core/src/main/java/org/apache/calcite/rel/core/Uncollect.java except that in deriveUncollectRowType() it does not unwrap array elements of struct type.
- ZetaSqlUnnest(RelOptCluster, RelTraitSet, RelNode, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
-
Creates an Uncollect.
- ZetaSqlUnnest(RelInput) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
-
Creates an Uncollect by parsing serialized output.
- ZetaSqlUserDefinedSQLNativeTableValuedFunction - Class in org.apache.beam.sdk.extensions.sql.impl
-
A class indicating that a TVF (table-valued function) is a ZetaSQL SQL-native UDTVF.
- ZetaSqlUserDefinedSQLNativeTableValuedFunction(SqlIdentifier, SqlReturnTypeInference, SqlOperandTypeInference, SqlOperandTypeChecker, List<RelDataType>, Function) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.ZetaSqlUserDefinedSQLNativeTableValuedFunction
- ZIP - Enum constant in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- ZIP - Enum constant in enum class org.apache.beam.sdk.io.Compression
-
Zip compression.
- ZIP - Enum constant in enum class org.apache.beam.sdk.io.TextIO.CompressionType
-
Deprecated.
- ZIP - Enum constant in enum class org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
-
Deprecated.
- ZLIB - Enum constant in enum class org.apache.beam.sdk.io.TFRecordIO.CompressionType
-
Deprecated.
- ZSTD - Enum constant in enum class org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- ZSTD - Enum constant in enum class org.apache.beam.sdk.io.Compression
-
ZStandard compression.
- ZSTD - Enum constant in enum class org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- ZSTD - Enum constant in enum class org.apache.beam.sdk.io.TextIO.CompressionType
-
Deprecated.
- ZstdCoder<T> - Class in org.apache.beam.sdk.coders
-
Wraps an existing coder with Zstandard compression.
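For illustration, a minimal sketch, assuming the ZstdCoder.of factory; 'lines' is an assumed PCollection<String>:

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.coders.ZstdCoder;

    // Encode elements with UTF-8, then compress each encoded value with Zstandard.
    lines.setCoder(ZstdCoder.of(StringUtf8Coder.of()));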
_
- _ATN - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- _ATN - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- _decisionToDFA - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- _decisionToDFA - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- _serializedATN - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- _serializedATN - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
- _sharedContextCache - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
- _sharedContextCache - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
Consider using ApproximateCountDistinct in the zetasketch extension module, which makes use of the HllCount implementation.
If ApproximateCountDistinct does not meet your needs, you can use HllCount directly. Direct use also lets you save the intermediate aggregation result into a sketch for later processing.
For example, to estimate the number of distinct elements in a PCollection<String>, see the sketch below.
For more details about using HllCount and the zetasketch extension module, see https://s.apache.org/hll-in-beam#bookmark=id.v6chsij1ixo7.
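A minimal sketch of that example, using the zetasketch HllCount transforms; 'strings' is an assumed PCollection<String>, and the method names reflect the extension's documented API:

    import org.apache.beam.sdk.extensions.zetasketch.HllCount;
    import org.apache.beam.sdk.values.PCollection;

    // Build one HLL++ sketch over the whole collection, then extract the
    // estimated distinct count from it.
    PCollection<byte[]> sketch = strings.apply(HllCount.Init.forStrings().globally());
    PCollection<Long> distinct = sketch.apply(HllCount.Extract.globally());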